passing a flow from parent job to child job

Five Stars

passing a flow from parent job to child job

I am building a Spark job to process an input file. I need to pass certain columns from a source feed to a child job (tRunJob or joblet) and apply some transformations. Is there a way to pass the whole column's data in the flow? Is it possible to keep this data in Spark memory from the parent job and access that RDD from the child job?

Moderator

Re: passing a flow from parent job to child job

Hello,

Could you please let us know if this topic helps?

https://community.talend.com/t5/Design-and-Development/Passing-a-value-from-a-parent-Job-to-a-child-...

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Five Stars

Re: passing a flow from parent job to child job

Thanks for the quick reply. As I indicated, it is a Spark job I am developing, and there is no tFlowToIterate component in version 6.5. Are the tCache or RDD components available across jobs?

Moderator

Re: passing a flow from parent job to child job

Hello,

The tCacheOut and tCacheIn components use RDDs in Spark Batch jobs. Are you trying to pass a value from a parent Spark job to a child standard job?

The Spark framework works quite differently and runs across many nodes.
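
For reference, this is roughly what that caching amounts to in plain Spark (the Java API that Talend Spark jobs generate). The file path and column positions below are made up for illustration, and the cached RDD is only reusable within the same Spark job run, not across separately launched jobs.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.storage.StorageLevel;

public class CacheSketch {
    public static void main(String[] args) {
        JavaSparkContext sc =
                new JavaSparkContext(new SparkConf().setAppName("cache-sketch"));

        // Read the source feed and keep only the columns of interest
        // (the path and the column positions are hypothetical).
        JavaRDD<String> selectedColumns = sc.textFile("hdfs:///data/source_feed.csv")
                .map(line -> {
                    String[] cols = line.split(",");
                    return cols[0] + "," + cols[2];
                });

        // Persist in executor memory so later subjobs of the SAME Spark job
        // can reuse it without re-reading the file -- roughly what tCacheOut does.
        selectedColumns.persist(StorageLevel.MEMORY_ONLY());

        // Any further transformation (the tCacheIn side) works on the cached RDD.
        System.out.println("Cached rows: " + selectedColumns.count());

        sc.stop();
    }
}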

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Five Stars

Re: passing a flow from parent job to child job

Can the cache components be shared between jobs?

Moderator

Re: passing a flow from parent job to child job

Hello,

Are you trying to pass a value from a parent Spark job to a child standard job, or from a parent Spark job to a child Spark job?

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Five Stars

Re: passing a flow from parent job to child job

I need to do both: pass data to the child job, get it transformed, and send the new data back to the parent.

Moderator

Re: passing a flow from parent job to child job

Hello,

If it is Spark Streaming (Talend Real-Time Big Data), you can publish the data to Kafka or JMS and pick it up in a standard Talend job. If it is Spark Batch, you cannot do this; you would have to pass the data via an HDFS file.
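
As a rough illustration of the Spark Batch hand-off via an HDFS file, here is a plain Spark (Java API) sketch. All paths, the column position, and the transformation are hypothetical, and in a real Talend job the reads and writes would be done with the job's file input/output components rather than hand-written code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class HandOffViaHdfs {
    public static void main(String[] args) {
        JavaSparkContext sc =
                new JavaSparkContext(new SparkConf().setAppName("handoff-sketch"));

        // Parent Spark Batch job: extract the needed column and materialize it on HDFS.
        JavaRDD<String> selected = sc.textFile("hdfs:///input/source_feed.csv")
                .map(line -> line.split(",")[1]);
        selected.saveAsTextFile("hdfs:///tmp/handoff/selected_columns");

        // The child job (shown in the same main() only for brevity; in practice it is
        // a separate job with its own context) reads that path, transforms the data,
        // and writes its result back for the parent to pick up.
        JavaRDD<String> fromParent = sc.textFile("hdfs:///tmp/handoff/selected_columns");
        fromParent.map(String::toUpperCase)
                  .saveAsTextFile("hdfs:///tmp/handoff/transformed");

        sc.stop();
    }
}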

Here is an article about it: https://community.talend.com/t5/Architecture-Best-Practices-and/Spark-Dynamic-Context/ta-p/33038

Hope it will help.

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
Five Stars

Re: passing a flow from parent job to child job

Can we handle persistent RDDs from Spark jobs? That is, an RDD that stays loaded in memory all the time and is accessible from various Talend jobs. Please let me know how this functionality is available in Talend Spark jobs.
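
As a general Spark note (not specific to Talend components): an RDD persisted in memory only lives as long as the SparkContext that created it, so a separately launched job cannot see it. The closest built-in workaround is to materialize the RDD, for example as an object file on HDFS, and reload it in the other job. A minimal plain-Spark Java sketch, with a hypothetical path:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.storage.StorageLevel;
import java.util.Arrays;

public class PersistSketch {
    public static void main(String[] args) {
        JavaSparkContext sc =
                new JavaSparkContext(new SparkConf().setAppName("persist-sketch"));

        JavaRDD<String> rdd = sc.parallelize(Arrays.asList("a", "b", "c"));

        // persist() keeps the RDD in executor memory, but only for the lifetime
        // of this SparkContext -- a job launched later cannot access it.
        rdd.persist(StorageLevel.MEMORY_ONLY());

        // To hand the data to a different job run, it has to be materialized,
        // e.g. as a serialized object file on HDFS (hypothetical path).
        rdd.saveAsObjectFile("hdfs:///tmp/shared/persisted_rdd");

        // A later job (with its own SparkContext) reloads it from that path.
        JavaRDD<String> reloaded = sc.objectFile("hdfs:///tmp/shared/persisted_rdd");
        System.out.println("Reloaded rows: " + reloaded.count());

        sc.stop();
    }
}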
