OutOfMemoryError

One Star

OutOfMemoryError

Hi,
I have a job that moves data from Sybase ASE to Sybase IQ. In this mapping another table is joined and a lookup is performed against it; the lookup table holds about 9 million records. While running this job we get the following error:
java.lang.OutOfMemoryError: GC overhead limit exceeded


Thanks in Advance
Siva
Moderator

Re: OutOfMemoryError

Hi siva,
Have you checked the response in forum thread 31749 to see if it works for you?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Seventeen Stars

Re: OutOfMemoryError

hi all,
"GC overhead limit exceeded" means that your garbage collector is working too hard and too often.
Optimize your job and, if possible, increase the JVM's -Xmx.
Have a look at
http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html#par_gc.oom
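
For example, in Talend Open Studio you can usually raise the heap for a job in the Run view > Advanced settings > "Use specific JVM arguments". The values below are only illustrative; adjust them to the memory actually available on the machine running the job:
-Xms1024M
-Xmx4096M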

regards
laurent
One Star

Re: OutOfMemoryError

I have optimized the job by storing the lookup data on local disk, and it is working fine now.
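
For anyone hitting the same error: storing lookup data on disk is typically done through the tMap Advanced settings, roughly like this (exact option names may differ by Talend version):
- enable "Store temp data" for the lookup flow
- set "Temp data directory path" to a local folder with enough free space
This trades some lookup speed for a much smaller heap footprint, which avoids the GC overhead limit.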

Thanks
Siva
