I'm using Talend Open Studio v6.4 on a 64-bit Windows machine with 16 GB of RAM.
I have 3 jobs, each with JVM arguments set to 1024 MB (Xms) and 4096 MB (Xmx).
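For reference, heap settings like these are entered per job in the Run view under Advanced settings ("Use specific JVM arguments"); the values below are the ones mentioned above:

```
-Xms1024M
-Xmx4096M
```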
All three jobs have to process 1–10 million records.
The first two jobs ran successfully, but the third job failed with a memory overflow.
Do we need some extra workaround to release the memory allocated by the JVM after a job completes its run, or does Talend release that memory itself?
If you are processing a huge amount of data (many columns, many records, and joins), Talend holds the intermediate data in system memory. To get around this limitation, the tMap lookup join has a "Store temp data" (store on disk) option that temporarily writes the lookup data to disk instead of keeping it in memory for processing.
That way you won't hit memory issues the next time you run your job.
Are there some cache components consuming too much memory in your job, such as tMap, tUniqRow, or tSortRow? For a large data set, try storing the data on disk instead of in memory for tMap, tUniqRow, and tSortRow.
What does your job flow look like?