java.lang.OutOfMemoryError: GC overhead limit exceeded

Five Stars

java.lang.OutOfMemoryError: GC overhead limit exceeded

Hi, I am trying to load data from a Sybase input whose SELECT query does a join and fetches around 6 million records, then load them into a Sybase table through a tMap and a tSybaseOutput. The load gets stuck at just 1 row; when I increased the batch size it got to about 1 lakh (100,000) records and then failed with this error:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.lang.StringCoding$StringDecoder.decode(Unknown Source)
at java.lang.StringCoding.decode(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at com.sybase.jdbc3.utils.PureConverter.toUnicode(Unknown Source)

 

Please suggest.


Accepted Solutions
Employee

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@rahuljan 

 

Have you selected the Die on Error condition in your tSybaseOutput component?

 

The batch size is used for committing the records; if you do not provide a value, it will take the default commit size.

 

In your case, considering your huge data volume, why are you not using the Bulk components to load the data?
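The batch-size mechanics can be sketched in plain Java (a toy `BatchWriter` of my own, not Talend's code): rows are buffered and flushed in groups, so the writer's memory stays bounded no matter how many of the 6 million rows pass through. tSybaseOutput does something similar internally via JDBC batching.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/** Toy sketch of batched writing: rows are buffered and flushed every
 *  batchSize rows, so memory use stays bounded regardless of the total
 *  row count. Not Talend's code -- just the principle behind the
 *  "Batch Size" option. */
class BatchWriter<T> {
    private final int batchSize;
    private final Consumer<List<T>> flushFn;  // e.g. a JDBC executeBatch + commit
    private final List<T> buffer = new ArrayList<>();
    int flushes = 0;                          // counter, for illustration only

    BatchWriter(int batchSize, Consumer<List<T>> flushFn) {
        this.batchSize = batchSize;
        this.flushFn = flushFn;
    }

    void write(T row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) flush();
    }

    void flush() {                            // also call once at end-of-input
        if (buffer.isEmpty()) return;
        flushFn.accept(buffer);
        flushes++;
        buffer.clear();
    }
}
```

The point is that only `batchSize` rows are ever held at once; the final partial batch is pushed out by the closing `flush()`.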

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved :-)



All Replies
Fourteen Stars

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@rahuljan, this is a JVM memory issue; you need to increase the JVM memory properties. Check the link below:

 

https://community.talend.com/t5/Design-and-Development/resolved-OutOfMemoryError-GC-overhead-limit-e...
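For reference, the heap settings go in the job's Run tab, under Advanced settings (JVM arguments). The sizes below are only an example; pick values that fit the machine running the job:

```
-Xms1024m
-Xmx4096m
```

If the job still dies with "GC overhead limit exceeded" even with a larger heap, the data flow itself (for example the tMap lookup) is usually what needs to change, not just the heap size.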

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Employee

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

Hi,

 

    My 2 cents would be to use disk space to store the temporary data in tMap.

    Please refer to the link below for details.

 

https://help.talend.com/reader/EJfmjmfWqXUp5sadUwoGBA/J4xg5kxhK1afr7i7rFA65w
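As a rough illustration of what "store temp data on disk" buys you (a toy sketch, not Talend's actual implementation): lookup rows are appended to a temporary file and streamed back one at a time when the join runs, so the heap never has to hold the full lookup set.

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Collections;
import java.util.function.Consumer;

/** Toy illustration of spilling rows to disk instead of the heap.
 *  Rows go into a temp file; reading streams them back one line at
 *  a time, so memory holds a single row rather than the whole set. */
class DiskBuffer {
    private final Path file;

    DiskBuffer() throws Exception {
        file = Files.createTempFile("tmap-lookup-", ".tmp");
        file.toFile().deleteOnExit();
    }

    void add(String row) throws Exception {
        // Append one row per line to the temp file.
        Files.write(file, Collections.singletonList(row), StandardOpenOption.APPEND);
    }

    void forEach(Consumer<String> fn) throws Exception {
        // Stream rows back; only one line is in memory at a time.
        try (BufferedReader r = Files.newBufferedReader(file)) {
            for (String line; (line = r.readLine()) != null; ) fn.accept(line);
        }
    }
}
```

The trade-off is the same one tMap makes: slower (disk I/O) but bounded memory, which is what matters with a 6-million-row join.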

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved :-)


Five Stars

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@nikhilthampi @manodwhb Even after enabling the store-on-disk property, the job passes 10,000 rows and then stops. 100,000 is the default batch size set in tSybaseOutput. Please suggest.

Fourteen Stars

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@rahuljan, could you please let me know what JVM parameters you are using (-Xms and -Xmx)?

Manohar B
Don't forget to give kudos/accept the solution when a reply is helpful.
Five Stars

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@manodwhb @nikhilthampi I am using -Xms256m and -Xmx1024m. The real problem is that when I enable a batch size of 10,000 in the output component, the job stops after reading 10,000 records; if I disable it, it runs until about 1 lakh (100,000) records. Why is the batch size restricting records after 10K? Any idea?

Employee

Re: java.lang.OutOfMemoryError: GC overhead limit exceeded

@rahuljan 

 

Have you selected the Die on Error condition in your tSybaseOutput component?

 

The batch size is used for committing the records; if you do not provide a value, it will take the default commit size.

 

In your case, considering your huge data volume, why are you not using the Bulk components to load the data?

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved :-)


