
Dynamically define context variable values from a flat file in a Big Data Spark job

Hi,

I am trying to update context variables at run time from a flat file. It is a Big Data Spark job.

[Attached screenshot: job.png]

I am reading a CSV file that contains context variable names and their values. With a tJavaRow component, I assign each value to the corresponding context variable. The same approach works fine in a Standard job, but not in a Big Data job.
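
For illustration, the tJavaRow logic is roughly the sketch below (a minimal example; it assumes the CSV schema has two String columns named key and value and a context variable named file_path, so the actual column and variable names may differ):

    // tJavaRow (Standard job) - minimal sketch.
    // Assumes the input schema has String columns "key" and "value"
    // and that the job defines a context variable named "file_path".
    if ("file_path".equals(input_row.key)) {
        context.file_path = input_row.value;
    }
    // Pass the row through so downstream components still receive it.
    output_row.key = input_row.key;
    output_row.value = input_row.value;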

Where am I going wrong with this approach? How can we assign values to context variables in a Big Data Spark job at run time from a flat file?

Thanks.

Tags:
  • Big Data
  • Data Integration
  • SDI
1 REPLY
Moderator

Re: Dynamically define context variable values from a flat file in a Big Data Spark job

Hi,

For Big Data batch jobs, the Talend recommendation is to use a Standard (DI) 'launcher' job that loads the "dynamic" context and then calls the Spark job, passing the resolved context values to it.

Also, there is currently no tContextLoad component for Big Data Batch or Spark Streaming jobs.
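
As a rough sketch of the launcher pattern (the variable name file_path, the separator, and the file path below are placeholders, not values from this thread): the Standard (DI) launcher job reads the flat file, for example with tFileInputDelimited + tContextLoad or with a tJava similar to the one below, and a tRunJob configured with "Transmit whole context" then hands the loaded values to the Big Data Spark child job.

    // tJava in the Standard (DI) launcher job - a minimal sketch, not the only option.
    // Assumes one "name;value" pair per line in the flat file (placeholder path below)
    // and a context variable named "file_path" declared in the launcher job.
    try {
        for (String line : java.nio.file.Files.readAllLines(
                java.nio.file.Paths.get("/path/to/context_values.csv"))) {
            String[] parts = line.split(";", 2);
            if (parts.length == 2 && "file_path".equals(parts[0].trim())) {
                context.file_path = parts[1].trim();
            }
        }
    } catch (java.io.IOException e) {
        throw new RuntimeException("Could not load dynamic context values", e);
    }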

Best regards

Sabrina