I am trying to update context variables at run time from a flat file in a Big Data Spark job.
I read a CSV file that contains context variable names and their values, and assign them with a tJavaRow component. The same approach works fine in a Standard job, but not in a Big Data job.
Where am I going wrong with this approach? How can values be assigned to context variables at run time from a flat file in a Big Data Spark job?
For Big Data batch jobs, Talend's recommendation is to use a DI "launcher" job that loads the dynamic context and then calls the Big Data job.
Also, so far there is no tContextLoad component for Big Data Batch or Spark Streaming jobs.
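To illustrate the launcher pattern, here is a minimal Java sketch of what the DI parent job effectively does: it reads a flat file of `key;value` pairs (the default tContextLoad file format) and turns each entry into a `--context_param` argument that the child Big Data job would receive. This is a conceptual sketch only, not actual Talend component code; the variable names and sample values are invented for illustration.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ContextLoader {

    // Parse "key;value" lines (tContextLoad's default format) into a map.
    static Map<String, String> parse(List<String> lines) {
        Map<String, String> ctx = new LinkedHashMap<>();
        for (String line : lines) {
            int sep = line.indexOf(';');
            if (sep > 0) {
                ctx.put(line.substring(0, sep).trim(),
                        line.substring(sep + 1).trim());
            }
        }
        return ctx;
    }

    public static void main(String[] args) {
        // In a real launcher job these lines would come from the flat file
        // (e.g. read via tFileInputDelimited); hard-coded here for the sketch.
        List<String> lines = Arrays.asList("db_host;prod-db-01", "db_port;5432");
        Map<String, String> ctx = parse(lines);

        // The launcher then passes each entry to the Big Data child job,
        // which is what tRunJob does when "Transmit whole context" or
        // explicit context parameters are configured.
        ctx.forEach((k, v) ->
                System.out.println("--context_param " + k + "=" + v));
    }
}
```

The key point is that the context is resolved in the DI parent before the Spark job starts, so the Big Data job receives the values as ordinary context parameters rather than trying to mutate them mid-run.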