I have developed around 150 jobs for a large project. Most of them read data from Oracle and load it into HDFS (the staging area), and I do not want to keep doing the same thing over and over again.
Since the task is repetitive, I am looking for a way to build one job that does everything, driven by some sort of metadata, to load the data into the early staging layers.
If no major transformations are involved, then there are multiple ways to design this job.
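One common pattern is a single generic job driven by a metadata table: each row supplies a source table and a target HDFS path, and the job loops over the rows, passing the values in as context variables. A minimal sketch of that idea follows; the table names, paths, connection string, and Sqoop-style command are illustrative assumptions, not details from the original post.

```python
# Hypothetical sketch of a metadata-driven load: one generic job iterates over
# a metadata list instead of 150 hand-built jobs. All table names, paths, and
# the connection string below are placeholders.

METADATA = [
    # (oracle_table, hdfs_staging_path)
    ("SALES.ORDERS",    "/staging/sales/orders"),
    ("SALES.CUSTOMERS", "/staging/sales/customers"),
    ("HR.EMPLOYEES",    "/staging/hr/employees"),
]

def build_extract_command(table: str, target: str) -> str:
    """Render the per-table extraction the generic job would run.

    In Talend this would be one job whose tOracleInput/tHDFSOutput components
    read context variables (e.g. context.table, context.target) inside a loop;
    here we just build an equivalent Sqoop-style command string.
    """
    return (
        "sqoop import "
        f"--table {table} "
        f"--target-dir {target} "
        "--connect jdbc:oracle:thin:@//dbhost:1521/ORCL"  # placeholder connection
    )

def plan_loads(metadata):
    """One command per metadata row -- the entire fan-out becomes a single loop."""
    return [build_extract_command(t, p) for t, p in metadata]

if __name__ == "__main__":
    for cmd in plan_loads(METADATA):
        print(cmd)
```

Adding a new source table then means adding a metadata row, not building a new job.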