Using the Implicit Context Load Feature for a Spark Job

Question

In Talend Studio, the Extra tab of a standard Job allows you to use the Implicit Context Load feature to load context parameters dynamically at Job execution time.

 

standard_job.png

 

However, the Extra tab is not available for a Spark Job in Talend Studio:

spark2_job_noextra.png

 

So how can you pass context parameters and use the Implicit Context Load feature with a Spark Job?

 

Answer

The solution is to create a standard Job, configured to use the Implicit Context Load feature, that invokes the Spark Job through the tRunJob component.
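As a rough illustration of the mechanism (plain shell, not Talend; the file path, parameter names, and values are all hypothetical), the Implicit Context Load feature reads key=value pairs from a source such as a file and maps them onto the Job's context variables at startup. Those loaded values are what the standard Job can then hand to the Spark child Job:

```shell
# Hypothetical context file: one key=value pair per line, the
# shape of data an Implicit Context Load file source provides.
cat > /tmp/job_context.txt <<'EOF'
db_host=prod-db.example.com
input_path=/data/in/sales
EOF

# At startup the standard Job reads each pair and sets the
# matching context variable; sketched here as a read loop.
while IFS='=' read -r key value; do
  printf 'context.%s=%s\n' "$key" "$value"
done < /tmp/job_context.txt
```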

 

To pass the context parameters on to the Spark Job, select both the Use an independent process to run subjob and Transmit whole context check boxes in the tRunJob component settings. Below is the tRunJob configuration used to invoke the Spark Job (my_simple_spark_test):

trunjob.png
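At run time, the exported standard Job can also receive context values from the command line: builds exported from Talend Studio accept --context_param overrides, and with Transmit whole context checked those values flow through to the Spark child Job as well. A sketch (the launch script name and the parameter are hypothetical):

```
# Hypothetical launch of the exported parent standard Job;
# --context_param overrides a context variable at execution time.
./parent_job_run.sh --context_param input_path=/data/in/sales
```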

Version history: revision 9 of 9, last updated 12-06-2017 03:06 PM.