Garbage Collection: Sub Jobs and General Best Practice

Seven Stars


Does the community suggest running Java garbage collection after executing subjobs, or is there a better way to deal with these errors when it's not possible to add more hardware to the solution?

Moderator

Re: Garbage Collection: Sub Jobs and General Best Practice

Hi,

Did you get the "java.lang.OutOfMemoryError: GC overhead limit exceeded" error?

When a job fails with the "java.lang.OutOfMemoryError: GC overhead limit exceeded" error, the JVM is signalling that your application is spending too much time in garbage collection with too little to show for it: by default the error is thrown when more than 98% of the time goes to GC while less than 2% of the heap is recovered.

Usually, increasing the heap size with the -Xmx parameter gives a quick fix for this kind of issue.
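
If you want to try that, one place to set it is the job's Run tab > Advanced settings > "Use specific JVM arguments" (the values below are only an example sketch; adjust them to the memory available on your machine):

-Xms1024M
-Xmx4096M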

It is also recommended that you check or profile the job design; performance issues are usually caused by the DB connection or the job design.
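
As a rough way to see where memory grows, you could log the JVM's heap usage from a tJava component placed between subjobs (just a sketch; the numbers from Runtime are approximate):

// Print approximate heap usage in MB at this point in the job
Runtime rt = Runtime.getRuntime();
long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
System.out.println("Heap used (MB): " + usedMb);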

Hope this link will help: https://plumbr.eu/outofmemoryerror

Best regards

Sabrina

