Spark Job Configuration error

Four Stars

Spark Job Configuration error

I am getting an error in a Big Data batch job for Spark telling me to add the hdp.version in spark-env.sh. I am using Hortonworks HDP 2.5.3.0 with Ambari 2.4.2.0 on a 4-node cluster. I have added the HDP version to spark-env.sh from Ambari and restarted Spark, but it still shows the same issue. I have attached screenshots of the error, the Spark Configuration, the advanced settings of the job, and the spark-env.sh content.
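For reference, the addition I made to the spark-env.sh template from Ambari looks roughly like this (the build suffix after 2.5.3.0 is only a placeholder here, not my exact build number):

# Added via Ambari > Spark > Advanced spark-env (build suffix is a placeholder)
export HDP_VERSION=2.5.3.0-xxxx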
Moderator

Re: Spark Job Configuration error

Hi,
Could you please also indicate which build version you got this issue on?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Seven Stars

Re: Spark Job Configuration error

You can usually find these in your YARN config in Ambari. Example below, where 2.3.2.0-2950 is your HDP version:
spark.driver.extraJavaOptions="-Dhdp.version=2.3.2.0-2950"
spark.yarn.am.extraJavaOptions="-Dhdp.version=2.3.2.0-2950"
spark.hadoop.mapreduce.application.framework.path="$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure"
spark.hadoop.mapreduce.application.classpath="/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz#mr-framework"
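If you need to set these from the Talend job rather than in Ambari, I believe the same values can also be entered as property/value pairs under the job's Spark Configuration > Advanced properties. A sketch below, reusing the 2.3.2.0-2950 build from the example above; substitute your own HDP build:

# Hypothetical Advanced properties entries in Talend Studio (values entered without quotes)
spark.driver.extraJavaOptions    -Dhdp.version=2.3.2.0-2950
spark.yarn.am.extraJavaOptions   -Dhdp.version=2.3.2.0-2950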
Four Stars

Re: Spark Job Configuration error

Thanks for your reply, Justin. I tried what you suggested, but it still shows me the "Spark context did not initialize" error.
 
I am using Talend version 6.3.1.
Seven Stars

Re: Spark Job Configuration error

Sorry, I'm unsure then; I've only set it up in 6.2 so far.