large csv file import

Seven Stars

large csv file import

I have many large files, each containing more than 2 million rows of CSV data.

I need to import them gradually into a MySQL table because I cannot load a whole CSV file into memory.

Could you guide me on which component I should use to do that?

I tried to use the tFileInputDelimited component, but it reads all the file data and consumes all of my laptop's memory.

Fourteen Stars

Re: large csv file import

@phancongphuoc, if you are getting a memory error, you can increase the JVM memory by following the link below.

https://community.talend.com/t5/Installing-and-Upgrading/Configure-to-use-a-JVM/td-p/112893
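
For example (a sketch only; the exact values depend on your machine's RAM, and this file name assumes a 64-bit Windows install of Talend Open Studio), you can raise the heap in the Studio's .ini file, e.g. TOS_DI-win-x86_64.ini:

    -vmargs
    -Xms1024m
    -Xmx4096m

-Xms sets the initial heap size and -Xmx the maximum; restart the Studio after changing them.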

 

You can also try the tMysqlBulkExec component to improve performance: it delegates the load to the MySQL server instead of streaming every row through the JVM. To know more about tMysqlBulkExec, see the link below.

https://help.talend.com/reader/jomWd_GKqAmTZviwG_oxHQ/YhYqawgnulVXpdzE1lJ6cg
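
For context, here is a sketch of what a bulk load does on the MySQL side (the table name, file path, and separators are assumptions, not values from this thread): the server loads the file directly, so the rows never pass through Talend's JVM.

    LOAD DATA LOCAL INFILE 'C:/data/big_file.csv'
    INTO TABLE my_table
    FIELDS TERMINATED BY ';'
    LINES TERMINATED BY '\n';

The trade-off is that the file must be accessible to the MySQL client or server, and you cannot transform rows during the load.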

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Sixteen Stars TRF

Re: large csv file import

You may also loop over your tFileInputDelimited with the Limit parameter set to the maximum number of rows you are able to manage at once (for example 250,000) and set the Header parameter dynamically, based on the loop index, to skip the previously imported rows; see the sketch below.
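
A minimal sketch of that setup, assuming a tLoop named tLoop_1 in "For" mode (From 0, Step 1, To high enough to cover the whole file) connected to the tFileInputDelimited with an Iterate link; the component name and the 250,000 chunk size are assumptions:

    Limit:  250000
    Header: ((Integer)globalMap.get("tLoop_1_CURRENT_VALUE")) * 250000

On iteration 0 the Header skips nothing, on iteration 1 it skips the first 250,000 rows, and so on; add 1 to the expression if the file starts with a header row.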

You may also split your big file into small chunk files and iterate over the list of created files.

If you connect tFileInputFullRow to tFileOutputDelimited and set the advanced parameter "Split output in several files" to the number of lines you are able to manage at once, it should be an easier solution; a configuration sketch follows.
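
As a configuration sketch (the option name is quoted from this thread; the row-count field under it may be labeled slightly differently depending on your Studio version, and 250,000 is just an example):

    tFileOutputDelimited > Advanced settings:
        [x] Split output in several files
            Rows in each output file: 250000

Each generated file is then small enough to be read back and loaded in a separate subjob.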


TRF
Seven Stars

Re: large csv file import

Hello Manohar,

I started with Talend just about 3 hours ago. Could you please tell me which one I should use for my case, TOS_DI or TOS_BD?

Seven Stars

Re: large csv file import

Dear TRF,

It seems your suggestion is very appropriate to my case.

Could you guide me through the steps to do that? I am very new to TOS.

Many thanks

Seven Stars

Re: large csv file import

Dear TRF

 

The screenshot shows what I am trying to do (connect tFileInputFullRow to tFileOutputDelimited).

But how do I configure the tFileInputFullRow, since it requires a File Name?

Sixteen Stars TRF

Re: large csv file import

The job design should be like this:

tFileInputFullRow --> tFileOutputDelimited (just to create small files - set schema to a single field)
|
onSubJobOK
|
tFileList --> tFileInputDelimited (with the real schema) --> tMap --> IHSDatabase


TRF
Seven Stars

Re: large csv file import

Dear TRF

I got the OutOfMemoryError message.

I still wonder whether tFileInputFullRow really reads row by row; it seems tFileInputDelimited would make more sense here. Am I right?

[Screenshot: Capture1.PNG]

Fourteen Stars

Re: large csv file import

@phancongphuoc, increase the JVM memory and see.

Follow the link below to set the JVM:

https://community.talend.com/t5/Installing-and-Upgrading/Configure-to-use-a-JVM/td-p/112893

 

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Seven Stars

Re: large csv file import

Hi Manohar,

I am using 64-bit TOS and I have no problem with Java as described in your link.

I am still confused about what you mentioned.

Fourteen Stars

Re: large csv file import

@phancongphuoc, check the link below.

https://community.talend.com/t5/Design-and-Development/Memory-issues-when-profiling-large-data-sets-...

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Seven Stars

Re: large csv file import

Hi Manohar,
Thanks for your suggestion. I am trying it now.
But do we have a solution for incremental import, as TRF mentioned above? I mean that we don't load all the rows of the file into RAM; instead we insert a number of rows, then continue loading the others.
Sixteen Stars TRF

Re: large csv file import

@phancongphuoc, your job design doesn't make sense.

Here is what I suggest:

[Screenshot: job.png]

With the 1st subjob, you'll split your big file into smaller CSV files, thanks to the "Split output in several files" option.

For example, you can generate files of 100,000 records.

tFileInputFullRow considers the input file as having a single field called "line". Use the same schema for the tFileOutputDelimited, and don't include a header in the output files.

For the 2nd subjob, use the tFileList component to iterate over the list of previously generated CSV files.

tFileInputDelimited lets you read each CSV file one by one with the desired schema.

You have to use the following expression for the filename:

((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))

The content of the current file is pushed to the database by the tMysqlOutput component.
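
A sketch of how the 2nd subjob fits together, assuming the small files were written to a dedicated folder (the directory and filemask are assumptions; adjust them to wherever the 1st subjob writes):

    tFileList_1:
        Directory: "C:/work/chunks"
        Filemask:  "*.csv"

    tFileInputDelimited > File name:
        ((String)globalMap.get("tFileList_1_CURRENT_FILEPATH"))

Because the files are processed one per iteration, only one chunk is in memory at a time, whatever the total size of the original file.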


TRF
Seven Stars

Re: large csv file import

Hi TRF,

 

I tried your approach, but it seems that the tFileInputFullRow component reads all the rows of the file before separating them into several files.

This means that it loads all the file content into RAM.

So I still get the error "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space", even at that first step of separating the file into several files.

[Screenshot: Capture.PNG]

I also tried to increase the Java heap to more than 3 GB.

[Screenshot: Capture2.PNG]

None of those attempts worked.
