One Star

tDenormalize causes OutOfMemory in combination with large files.

Hello All,
I have created a DI job that loads 4 million rows from an ASCII text file. The data flows into a tMap and from there on to a tDenormalize component. The memory consumption of this job is extraordinarily high. I took a Java heap dump and can see one culprit among others: the tDenormalize component, or rather its objects, occupies a huge part of the heap. The number of HashMap objects is also very high, and they actually take up most of the heap. Does anyone have an idea how to avoid using tDenormalize, or how to make it more efficient?

Is it true that Talend makes no attempt to save memory and instead simply assumes unlimited memory is available?
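To illustrate what I suspect is happening, here is a minimal Java sketch (not Talend's actual generated code, just an illustration with a hypothetical key;value input) of why a key-grouped denormalization is memory-hungry: every distinct group key has to stay in a HashMap, together with the values concatenated so far, until the whole flow has been read.

import java.util.HashMap;
import java.util.Map;
import java.util.StringJoiner;

// Illustration only: a group-by-key denormalize buffers one entry per
// distinct key until the end of the flow, so heap usage scales with the
// number of groups and the size of the concatenated values.
public class DenormalizeSketch {
    public static void main(String[] args) {
        // hypothetical input rows in the form key;value
        String[] rows = { "A;1", "B;2", "A;3", "B;4", "A;5" };

        Map<String, StringJoiner> groups = new HashMap<>();
        for (String row : rows) {
            String[] parts = row.split(";");
            groups.computeIfAbsent(parts[0], k -> new StringJoiner(","))
                  .add(parts[1]);
        }

        // Output can only be emitted once all rows have been buffered.
        groups.forEach((key, values) -> System.out.println(key + ";" + values));
    }
}

With 4 million input rows the heap grows with the number of groups and the length of the concatenated strings, which would match the large number of HashMap entries I see in the dump. One general way around this would be to sort the input on the grouping key first (for example with an external sort) and emit each group as soon as the key changes, so that only one group at a time has to be held in memory.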
Kind regards
Hilderich
2 REPLIES
Four Stars

Re: tDenormalize causes OutOfMemory in combination with large files.

I know that if the "lookup" files are large, like thousands of rows, those values will be loaded into memory. You may have done this already, but if not, increase the JVM memory of the job. This can be changed on the "Run" tab: on the left select "Advanced settings", check "Use specific JVM arguments", and double-click each argument to increase the memory the job is allowed to use. The maximum heap is the bottom one; I usually limit this to 4 GB with "-Xmx4096M".
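For reference (these values are only an example, not a recommendation for this particular job), the two JVM arguments on the Advanced settings tab usually look something like this:

-Xms256M
-Xmx4096M

-Xms sets the initial heap size and -Xmx the maximum; when a large flow fails with OutOfMemoryError, it is the -Xmx value that needs to be raised.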
Moderator

Re: tDenormalize causes OutOfMemory in combination with large files.

Hi,
Have you checked the KB article TalendHelpCenter:ExceptionOutOfMemory?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.