I need some help resolving the issue below. I am doing an incremental load, changing a few fields in tMap.
This is a large dataset.
Please find the attached job screenshot.
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
It works fine with sample data. Can someone explain why it throws this error for the large dataset?
Have you tried storing your data on disk instead of in memory when using the tMap component in your workflow?
This exception means that the Job ran out of memory.
If a Job uses a large amount of memory, try setting JVM parameters for it.
For more information, please have a look at the related document: TalendHelpCenter: How to set advanced execution settings.
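As a sketch of what those JVM parameters look like: in the Job's Run view, under Advanced settings, you can enable "Use specific JVM arguments" and raise the heap limits. The values below are only example sizes for illustration; tune them to the memory available on your machine:

```
-Xms256M
-Xmx4096M
```

-Xms sets the initial heap size and -Xmx the maximum heap size. The "GC overhead limit exceeded" error means the JVM is spending almost all of its time in garbage collection while reclaiming very little memory, so giving the Job a larger -Xmx is usually the first thing to try.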
Thanks for the reply. I tried that and it's loading now.
I have a small question on "Have you tried to store your data on disk instead of memory when you are using tMap component in your workflow?"
I connected directly to the database and retrieved the schema. Can you please tell me how to store this data on disk in Talend? Could you please elaborate on this?