Four Stars

GC Overhead limit Exceeded

Hi all,

I am new to Talend and am supporting a production system. Today I encountered the error below in a job.

The error occurs while tVerticaInput is reading data from a table and dumping it into a file.

Would you please give me some direction?

Here is the part of the job where (I believe) the error occurs.

 

(Attached screenshot: GC Overhead Error.PNG)

java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.vertica.io.ProtocolStream.readMessage(Unknown Source)
at com.vertica.dataengine.VResultSet.fetchChunk(Unknown Source)
at com.vertica.dataengine.VResultSet.moveToNextRow(Unknown Source)
at com.vertica.dataengine.VResultSet.closeCursor(Unknown Source)
at com.vertica.dataengine.VResultSet.close(Unknown Source)
at com.vertica.jdbc.common.SForwardResultSet.close(Unknown Source)
at com.vertica.jdbc.common.SStatement.clearResults(Unknown Source)
at com.vertica.jdbc.common.SStatement.close(Unknown Source)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tVerticaInput_3Process(DMP_Data_Export_V2.java:9840)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2$2.run(DMP_Data_Export_V2.java:8870)
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:2219)
at java.util.ArrayList.grow(ArrayList.java:242)
at java.util.ArrayList.ensureExplicitCapacity(ArrayList.java:216)
at java.util.ArrayList.ensureCapacityInternal(ArrayList.java:208)
at java.util.ArrayList.add(ArrayList.java:440)
at com.vertica.util.TypeUtils.serialize(Unknown Source)
at com.vertica.io.CopyDataRequestMessage.sendParamData(Unknown Source)
at com.vertica.io.CopyDataRequestMessage.send(Unknown Source)
at com.vertica.io.ProtocolStream.sendMessage(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.sendCopyData(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.handleExecuteResponse(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.execute(Unknown Source)
at com.vertica.jdbc.common.SPreparedStatement.executeBatch(Unknown Source)
at com.vertica.jdbc.VerticaJdbc4PreparedStatementImpl.executeBatch(Unknown Source)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tOracleInput_4Process(DMP_Data_Export_V2.java:12430)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tOracleInput_3Process(DMP_Data_Export_V2.java:11838)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tVerticaRow_5Process(DMP_Data_Export_V2.java:11499)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tVerticaRow_4Process(DMP_Data_Export_V2.java:11362)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tJava_20Process(DMP_Data_Export_V2.java:11236)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tOracleRow_2Process(DMP_Data_Export_V2.java:10388)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2.tOracleRow_5Process(DMP_Data_Export_V2.java:10244)
at edw_bi_552.dmp_data_export_v2_2_0.DMP_Data_Export_V2$3.run(DMP_Data_Export_V2.java:8900)

9 REPLIES
Five Stars

Re: GC Overhead limit Exceeded

Hello,

You can try adjusting the JVM parameters used when the job is executed. In the Run view > Advanced settings tab, there is a JVM Settings section. If you select "Use specific JVM arguments", you can add command-line arguments to your JVM. Several arguments can increase the amount of memory available to your Job; for example, you may want to increase the initial and maximum memory size for the JVM.
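For example, the "Use specific JVM arguments" field might contain something like the following (the values are illustrative and must fit within the job server's physical memory):

```
-Xms1024M
-Xmx4096M
```

-Xms sets the initial heap size and -Xmx sets the maximum heap size.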

(Attached screenshot of the JVM Settings panel)

Regards,

Annie

 

Employee

Re: GC Overhead limit Exceeded

1) Can you share how much memory is allocated to this job?

2) What is the status of the job server: how much total memory and free memory does it have?

3) Approximately how many rows are you processing?

4) Do you get this issue at a particular time, or always? I am trying to understand whether the server is heavily loaded at that time.

Five Stars

Re: GC Overhead limit Exceeded

1) If you set the memory size to exceed the physical memory, the job will not run. My memory setting goes up to -Xmx8192M. -Xms sets the initial memory pool size and -Xmx sets the maximum memory pool size. You can try different sets of JVM parameters depending on your data volume.
2) The amount of data I currently process is about 12,000,000 rows.
3) This problem tends to appear when the data volume is too large.

Employee

Re: GC Overhead limit Exceeded

Is it right to assume that the job was previously running fine, and that it fails with this error whenever the data size is too high?

Employee

Re: GC Overhead limit Exceeded

Can you attach the job here, so we can look at whether some design changes could overcome this issue?

Four Stars

Re: GC Overhead limit Exceeded

I have attached a screenshot of the job that is causing the issue. The component giving the GC error is the tVerticaInput component, which runs a query. It is a monthly job that pulls data for the previous month. The volume may be higher than usual, which could be causing this issue.

I am in a situation where I can't increase the memory further.
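Since the heap cannot grow, the main lever is keeping the row flow streaming rather than buffered. The following is not the Talend-generated code, just a minimal Java sketch (the StreamDump class is hypothetical) of the pattern a read-and-dump subjob needs: each row is written to the file as soon as it arrives, so heap use stays flat regardless of row count. If your version of tVerticaInput exposes a cursor or fetch-size option in its Advanced settings, enabling it has the same effect on the read side; holding rows in an in-memory buffer (for instance via a tMap with lookups loaded fully into memory) defeats it.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Iterator;

public class StreamDump {
    // Write each row out as soon as it is produced, instead of collecting
    // all rows into a List first. Memory use stays constant no matter how
    // many rows the source (here an Iterator standing in for a result set)
    // returns.
    public static void streamToFile(Iterator<String> rows, Path out) throws IOException {
        try (BufferedWriter writer = Files.newBufferedWriter(out)) {
            while (rows.hasNext()) {
                writer.write(rows.next()); // no intermediate list of all rows
                writer.newLine();
            }
        }
    }
}
```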

Four Stars

Re: GC Overhead limit Exceeded

Is there any other JVM setting besides increasing memory? I can't increase the memory. I tried UseConcMarkSweepGC, but the job still failed after finishing 80% of the work.
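For context: "GC overhead limit exceeded" is thrown when the JVM spends the vast majority of its time in garbage collection while recovering only a tiny fraction of the heap. Two flags that are sometimes tried when the heap cannot grow (their effectiveness depends on your JVM version; treat these as experiments, not fixes):

```
-XX:+UseG1GC              (try a different collector)
-XX:-UseGCOverheadLimit   (disables the overhead check; usually only delays a plain OutOfMemoryError)
```

If the job genuinely needs more live data than the heap can hold, no collector flag will save it; reducing how much data the job holds at once is the real fix.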

Employee

Re: GC Overhead limit Exceeded

I have worked with Vertica; perhaps we can fine-tune your monthly query. Is the query using proper projections and partitions? Are all tables properly analyzed? Can you run your Vertica query through Database Designer with updated statistics?

Employee

Re: GC Overhead limit Exceeded

Is there any filter or aggregation that you perform after fetching the data from Vertica? I am not sure about your business logic, but would it be possible to apply that filter or aggregation in the Vertica query itself?
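To illustrate the difference (the table and column names below are made up for the example, not taken from your job): aggregating inside the job forces every detail row across JDBC and into the heap, while a GROUP BY in the Vertica query returns only one row per key.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class PushdownDemo {
    // What aggregating inside the job looks like: every [key, amount] detail
    // row has already crossed JDBC and is sitting in memory. With 12M rows,
    // this is where the heap pressure comes from.
    public static Map<String, Integer> aggregateInJob(List<String[]> rows) {
        return rows.stream().collect(Collectors.groupingBy(
                r -> r[0],
                Collectors.summingInt(r -> Integer.parseInt(r[1]))));
    }

    // The equivalent work pushed into Vertica: the database does the
    // summing, and the job receives one row per account instead of one row
    // per transaction.
    public static final String PUSHED_DOWN_QUERY =
            "SELECT account_id, SUM(amount) FROM monthly_export "
          + "WHERE export_month = ? GROUP BY account_id";
}
```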