Is there any way to convert a Standard job to a Big Data Batch job? I tried to move a Standard job into the Big Data Batch folder, but because a few components were not available, the job didn't get imported successfully.

Components not available in the Big Data Batch framework: tHDFSConfiguration, tHiveConnection, tHiveInput/tHiveOutput.

Please let me know if there is a way to use the same components and convert the job into a Big Data Batch job.
You can create a Job by converting it from a different framework, for example from Standard to MapReduce, or from MapReduce to Spark. This is advisable only if the components used in the source Job are also available in the target framework; components without an equivalent will block a clean conversion.
Please have a look at this online user guide: TalendHelpCenter: Converting Jobs

There is also an existing Jira issue tracking this limitation: https://jira.talendforge.org/browse/TUP-15693