I am creating a Job that just reads from an Oracle source (500,000 rows) and writes directly to a CSV. It gives me a timeout error after writing 50,000 lines. When I increased Xms to 512m and Xmx to 512g, it threw the error below after reading all the lines, but it took around 2 hours to read them. We have 48 GB of RAM on the machine and no other users. When I execute the DB query on its own, it works fine and returns output in a few seconds.
Any help will be appreciated.
Exception in thread "main" java.lang.OutOfMemoryError
at java.lang.AbstractStringBuilder.hugeCapacity(Unknown Source)
at java.lang.AbstractStringBuilder.newCapacity(Unknown Source)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(Unknown Source)
at java.lang.AbstractStringBuilder.append(Unknown Source)
at java.lang.StringBuilder.append(Unknown Source)
You can increase the JVM heap from the Run -> Advanced settings tab of the Job.
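For reference, the JVM arguments in Run -> Advanced settings look like this (the values below are examples, not a recommendation for your Job). Note that Xmx should stay within physical RAM: an Xmx of 512g cannot be backed by a 48 GB machine, so something like 4g-8g is more realistic here:

```
-Xms512m
-Xmx4096m
```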
If you see my earlier mail, I already made those changes, but it still throws the error after reading all the data.
More importantly, it's taking so much time to read the data.
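The trace (OutOfMemoryError inside AbstractStringBuilder / StringBuilder.append) suggests the whole result is being accumulated in one in-memory buffer before anything is flushed to disk, so a bigger heap only delays the failure. A minimal sketch of the streaming alternative, with synthetic rows standing in for the Oracle data (the row count and column layout are made up for illustration):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class StreamingCsvWrite {
    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("rows", ".csv");

        // Write each row as it is produced instead of appending all
        // 500,000 rows to a single StringBuilder and writing once at
        // the end -- the single-buffer approach is what the
        // AbstractStringBuilder.hugeCapacity frame in the trace
        // points to. Memory use here stays constant regardless of
        // row count.
        try (BufferedWriter w = Files.newBufferedWriter(out)) {
            for (int i = 0; i < 500_000; i++) {
                w.write(i + ",row_" + i);
                w.newLine();
            }
        }

        // Confirm every row reached the file.
        long count;
        try (Stream<String> lines = Files.lines(out)) {
            count = lines.count();
        }
        System.out.println(count); // prints 500000
    }
}
```

On the read side, a 2-hour read for 500,000 rows is typical of a small JDBC fetch size (the Oracle driver defaults to 10 rows per round trip). If your input component exposes it, raising the cursor/fetch size in tOracleInput's Advanced settings should cut the read time substantially; the exact option name depends on your Talend version.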