Maximum csv file size for loadData

Hi, I'm trying to use Liquibase to import data into a large database table. I'm running with this command:

liquibase --driver=oracle.jdbc.OracleDriver ^
     --classpath="C:\tools\tomcat-ex\lib\oracle-jdbc-11.2.0.2.0.jar" ^
     --changeLogFile=abc-schema-base.xml ^
     --url="jdbc:oracle:thin:@abc.def:1521:hijklmnop" ^
     --username=abcdef ^
     --password=abcdef ^
     --logLevel=DEBUG ^
     update

The changeset uses loadData to import a CSV file. The CSV file I'm importing is 650 MB. Whenever I run the update, I get OutOfMemoryErrors (after a couple of hours of waiting).
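For reference, the generated changeset looks roughly like the following (table, column, and file names changed here; the real one lists every column of the table):

     <changeSet id="1418214039634-42" author="adam (generated)">
         <loadData tableName="BIG_TABLE" file="big_table_data.csv" encoding="UTF-8">
             <column name="ID" type="NUMERIC"/>
             <column name="NAME" type="STRING"/>
             <column name="CREATED" type="DATE"/>
         </loadData>
     </changeSet>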

Both the changeset and csv were generated with generateChangeLog. So full marks to Liquibase for at least being able to extract the large table!

I'm running with -Xmx2048m, and Liquibase 3.3.2.

I'm considering breaking the CSV into multiple smaller files and multiple changesets (roughly the approach sketched below), but I'm unsure how small the CSVs would need to be. Does anyone have any info on the largest CSV file size Liquibase loadData can handle?
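If splitting is the way to go, what I have in mind is something like this, with the 650 MB file cut into smaller parts (the part file names are just placeholders, and each part would keep the CSV header row since loadData reads the column names from it):

     <changeSet id="load-big-table-1" author="adam">
         <loadData tableName="BIG_TABLE" file="big_table_part1.csv" encoding="UTF-8"/>
     </changeSet>
     <changeSet id="load-big-table-2" author="adam">
         <loadData tableName="BIG_TABLE" file="big_table_part2.csv" encoding="UTF-8"/>
     </changeSet>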

Has anyone else encountered and resolved this kind of problem?

Alternatively I can exclude this table from my Liquibase scripts as it's the only one causing issues.

Thanks in advance,
Adam.
