Use external (implicit) context or repository context?


I am new to Talend and trying to figure out the best approach for defining context variables. I am using the Big Data edition with TAC. I plan to define all Hadoop cluster parameters and connection details through contexts. The dilemma is whether to use implicit context or repository contexts (Dev, Test, Prod).

Here are some of the questions :

a) If I am using external or implicit context, how do I run the big data job standalone? Big data jobs do not have the "Extra" tab to load implicit context, so currently I use a standard job --> implicit context load in that job --> pass those variables to the child big data job.
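For reference, the implicit context load reads a flat key=value file at startup. A minimal sketch of such a file, with variable names and values that are purely assumptions for illustration (exported jobs can also receive individual values via the `--context_param` launch flag):

```shell
# Hypothetical context file consumed by Implicit Context Load
# (variable names and values are illustrative only).
cat > /tmp/dev.properties <<'EOF'
hdfs_namenode_uri=hdfs://namenode:8020
hive_jdbc_url=jdbc:hive2://hiveserver:10000
hdfs_user=talend
EOF

# Alternatively, pass values directly to an exported job's launch script
# (script name is hypothetical):
#   ./MyBigDataJob_run.sh --context_param hdfs_namenode_uri=hdfs://namenode:8020
```

The same file can exist with different values on each environment's server, so the job itself never changes.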

b) How do I secure passwords when using implicit context? Any user with access to the file on the Talend server would be able to see the password.

c) If we go with repository contexts (DEV/TEST/PROD), do I need to change anything when I deploy the job to a different environment and run it? I want it to be seamless and don't want to touch the job.
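For what it's worth, exported Talend jobs accept a `--context` flag on the launch script, so the same build can be pointed at a different context group per environment without editing the job. The job name below is hypothetical, and the snippet is only a minimal mock of how such a launcher resolves the flag:

```shell
# Real usage on an exported job (job name is hypothetical):
#   ./LoadCustomerData_run.sh --context=Prod
#
# Minimal mock of the flag resolution, defaulting to the Default group:
context=Default
for arg in "$@"; do
  case "$arg" in
    --context=*) context="${arg#--context=}" ;;
  esac
done
echo "Selected context group: $context"
```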

d) If we go with repository contexts (DEV/TEST/PROD), are passwords secured and encrypted? Looking at some posts, it seems the password gets stored in some file. Please provide the file name if that's true.

e) What are the best practices, and the pros and cons of each approach?

Thanks in advance :)



Re: Use external (implicit) context or repository context?

Any insights into this?

