Loading data into Hive Parquet table - Java Heap Space Issue

One Star


Hi,
   Talend Big Data job, running against a Hadoop dev cluster at my client's site.
   I am loading data into a Hive table in Parquet format using the tHiveRow component. While loading 400 MB of data, I am getting a Java Heap Space error.
If I load 250 MB of data it works fine; with larger volumes it fails with a Java Heap Space error:
Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
: local_project.job_eng_4000_salesinvoice_supplychain_0_1.job_Eng_4000_SalesInvoice_SupplyChain - tHiveRow_1 - Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask   
Hadoop Log
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
2016-10-12 18:50:06,793 INFO org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_COMPLETED for container container_e140_1475190707594_3000_01_000002 taskAttempt attempt_1475190707594_3000_m_000000_0
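Exit code 143 means the container was terminated, which in this situation usually points to the map task exceeding its memory limit. A common mitigation (a sketch only, not something confirmed against this cluster) is to raise the container and JVM heap limits for the session before running the load statement, for example via an extra tHiveRow executing statements like the following. The specific values and the table names in the comment are illustrative assumptions:

```sql
-- Hedged sketch: raise MapReduce container memory and JVM heap for this
-- session before the INSERT. 4096 MB containers with a ~3.2 GB heap
-- (roughly 80% of the container) are example values; tune to the cluster.
SET mapreduce.map.memory.mb=4096;
SET mapreduce.map.java.opts=-Xmx3276m;
SET mapreduce.reduce.memory.mb=4096;
SET mapreduce.reduce.java.opts=-Xmx3276m;

-- Then run the load into the Parquet table (hypothetical table names):
-- INSERT INTO TABLE sales_invoice_parquet SELECT * FROM sales_invoice_staging;
```

On YARN clusters, `yarn.nodemanager.vmem-check-enabled` and the per-queue limits can also cap what a container may request, so the settings above only help if the cluster allows containers of that size.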

 
Moderator

Re: Loading data into Hive Parquet table - Java Heap Space Issue

Hi,
Could you please indicate which build version you are using when you hit this issue?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: Loading data into Hive Parquet table - Java Heap Space Issue

Hi,
  I am using Talend Big Data version 6.2.
Thanks
Suresh