Dynamically define context variable values from flat file in Big data spark job


Hi,

 

I am trying to update context variables at run time from a flat file. It is a Big Data Spark job.

 

 

 

(Screenshot attached: job.png — job design)

 

I am reading a CSV file that contains context variable names and their values. With a tJavaRow component, I assign each value to its context variable. The same approach works fine in a Standard job, but not in a Big Data job.

 

Where am I going wrong with this approach? How can we assign values to context variables in a Big Data Spark job at run time from a flat file?

 

Thanks..

 

 

Moderator

Re: Dynamically define context variable values from flat file in Big data spark job

Hi,

For Big Data batch jobs, the Talend recommendation is to use a DI "launcher" job that loads the dynamic context and then calls the Spark job.

So far, there is no tContextLoad component available for Big Data Batch or Spark Streaming jobs.
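To illustrate what the DI launcher does: it reads the flat file as key/value pairs and loads them into the context before starting the Spark child job (typically via tRunJob, with the option to transmit the whole context to the child). Below is a minimal standalone sketch of that loading step, written outside Talend; the key,value CSV format and the sample variable names (`input_path`, `batch_size`) are assumptions, not part of any Talend API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.LinkedHashMap;
import java.util.Map;

public class ContextLoader {

    // Parse "key,value" lines into a context map, mirroring what
    // tContextLoad (or tFileInputDelimited + tJavaRow) does inside
    // the DI launcher job. Delimiter and file layout are assumptions.
    static Map<String, String> load(BufferedReader reader) throws IOException {
        Map<String, String> context = new LinkedHashMap<>();
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.isEmpty()) continue;
            int sep = line.indexOf(',');
            if (sep < 0) continue; // skip malformed lines
            String key = line.substring(0, sep).trim();
            String value = line.substring(sep + 1).trim();
            context.put(key, value);
        }
        return context;
    }

    public static void main(String[] args) throws IOException {
        // Sample content standing in for the flat file
        String csv = "input_path,/data/in\nbatch_size,500\n";
        Map<String, String> ctx = load(new BufferedReader(new StringReader(csv)));
        System.out.println(ctx.get("input_path")); // prints /data/in
        System.out.println(ctx.get("batch_size")); // prints 500
    }
}
```

In the Studio itself you would not write this by hand: the launcher job chains tFileInputDelimited → tContextLoad (available in Standard jobs) and then a tRunJob pointing at the Spark job, so the Spark job receives the already-resolved context values.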

Best regards

Sabrina
