I have generated XML using Talend Data Mapper: a multi-schema, nested XML structure.
There are 110 columns per record. I have used your approach with 2 maps.
The first map converts the flat-file structure to an XML structure with duplicates. The second map takes the first map's output as input and generates the final XML.
When processing 200 records it works fine, but as the volume increases, the first map's file size also grows exponentially. After running for long hours, the job fails.
Please share any ideas. Thank you for your time and effort.
Could you please specify the memory parameters you are using for the job? Did you measure the performance difference for various memory values?
The job is failing with an out-of-memory error; it seems the JVM heap is full.
I executed the job with an 8 GB JVM (Run tab --> Advanced settings: -Xmx4096), but the job still failed for only 200 records.
We have Enterprise Edition 7.1.1 and have also raised a ticket with Talend for this issue, but it is still not resolved.
So I am looking for community experts who can help.
The following error was received when the job failed:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
Hi @Rajender,
Please specify your -Xmx as 5120M or 5G.
If you omit the "M", the value will not be interpreted as megabytes.
In addition, you could read the file as a stream and give that a try.
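To illustrate the streaming suggestion: instead of loading the whole XML document into memory (as a DOM-based mapper step would), you can walk it event by event with StAX, so memory use stays flat regardless of file size. This is a minimal, generic sketch, not Talend-specific code; the element name "record" is just a placeholder for whatever your repeating element is.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.Reader;
import java.io.StringReader;

public class StreamCount {

    // Counts occurrences of a given element while streaming through the
    // document, never holding the whole XML tree in memory at once.
    public static int countElements(Reader source, String localName) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(source);
        int count = 0;
        while (reader.hasNext()) {
            // next() advances to the following parse event (start tag,
            // end tag, text, ...) without buffering the rest of the file.
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && reader.getLocalName().equals(localName)) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    public static void main(String[] args) throws Exception {
        // Tiny inline document for demonstration; in practice you would
        // pass a FileReader over the large generated file instead.
        String xml = "<root><record/><record/><record/></root>";
        System.out.println(countElements(new StringReader(xml), "record"));
    }
}
```

The same idea applies inside a Talend job: processing the intermediate file record by record rather than materializing it whole is what keeps the heap requirement constant as the volume grows.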
Thanks and Regards,