tRunJob - Calling Standard Job from Spark Job Running in YARN Cluster Mode (MapR Cluster)


I have a Spark job running in YARN cluster mode on a MapR cluster.

On the OnSubjobOk trigger, at the end of the job, I am invoking a Standard Job.

I am getting a class not found error. Looking at the log, it seems that the jar containing the subjob is being referenced from the user cache. The path is something like …

 

/tmp/hadoop-mapruat/nm-local-dir/usercache/useId/appcache/application_343434343434_1567/container_e8375037937593695474_1567_02_000001/subJobTest.jar


The exact same job works fine if I run the main Spark job in YARN client mode.

 

Spark job running in YARN cluster mode, calling a Standard Job through tRunJob -> invocation of the subjob is failing.

 

Spark job running in YARN client mode, calling a Standard Job through tRunJob -> invocation of the subjob works fine.

 

Question: Is subjob invocation from a Spark job in YARN cluster mode supported in Talend? If not, what is the workaround to invoke a Standard Job from a Spark job running in YARN cluster mode?
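One workaround sketch, assuming the root cause is that the subjob jar is not being shipped to the driver container in cluster mode: explicitly distribute the jar with spark-submit so YARN localizes it into every container and puts it on the classpath. The jar names and the main class below are illustrative assumptions, not the actual Talend-generated names:

```shell
# Hypothetical sketch: ship the subjob jar alongside the Spark application.
# In cluster mode the driver runs inside a YARN container, so any jar the
# driver needs must be localized there via --jars (or spark.yarn.dist.jars).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class my.talend.MainSparkJob \
  --jars /local/path/to/subJobTest.jar \
  /local/path/to/mainSparkJob.jar
```

If Talend Studio builds the spark-submit command for you, the equivalent Spark properties are spark.jars or spark.yarn.dist.jars; whether Studio exposes these under the Spark configuration's advanced properties depends on the Studio version.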

1 Comment

Please comment on this.