I have a scenario where I have to load about 200 target tables from 200 input files with a one-to-one mapping. Each file structure is different.
Each input file also has an extra column with values 'I'/'U'/'D', which determines whether I Insert/Update/Delete the corresponding records in the target table.
The simple solution is to write 200 different jobs, but I'm trying to design this as one single dynamic job.
Problems I am facing:
1. I need to change the schema of tFileInputDelimited dynamically.
2. I need a metadata file/table storing the primary keys of the different tables for updates/deletes (a possible layout is sketched after this list).
3. These primary keys will also have to change dynamically from table to table.
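For problem 2, one option is a small delimited metadata file that the job reads first. The table names, file paths, and key columns below are purely hypothetical placeholders:

table_name;file_path;primary_keys
CUSTOMERS;/data/in/customers.csv;CUSTOMER_ID
ORDER_LINES;/data/in/order_lines.csv;ORDER_ID,LINE_NO

Each row can then be pushed into context variables (for example via tFlowToIterate) so each iteration of the job knows which file to read, which table to write, and which columns form the key.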
Being a novice in the world of Talend, I would really appreciate any help.
Many thanks in advance.
I have thought of the same approach: I will load the files into temporary tables, and I will use context variables to store the metadata for the source table, target table, and primary keys.
But how can I control the insert/update flow without specifying a schema?
For updates I will need the individual field names of each table along with its primary keys.
For inserts I will also need to specify the column names, since my source temporary tables have two extra fields for the DML details; therefore I won't be able to use a simple bulk load.
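Since the schemas aren't known at design time, one way to control the flow is to build the SQL per row from the metadata instead of relying on a fixed-schema output component. Below is a minimal plain-JDBC sketch of that idea; the class and method names are just illustrative, and the table, column, and key names are assumed to come from your metadata table. Inside Talend, the same logic could live in a routine called from a tJavaRow:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class DynamicDmlWriter {

    private final String table;         // target table name, from metadata
    private final List<String> columns; // data columns, excluding the DML flag column
    private final List<String> keys;    // primary-key columns, from metadata

    public DynamicDmlWriter(String table, List<String> columns, List<String> keys) {
        this.table = table;
        this.columns = columns;
        this.keys = keys;
    }

    // Apply one row: 'values' maps column name -> value, 'flag' is 'I', 'U' or 'D'.
    public void apply(Connection con, Map<String, Object> values, char flag) throws SQLException {
        String sql;
        List<String> params = new ArrayList<>(); // column names in bind order
        switch (flag) {
            case 'I':
                sql = "INSERT INTO " + table + " (" + String.join(", ", columns) + ") VALUES ("
                        + columns.stream().map(c -> "?").collect(Collectors.joining(", ")) + ")";
                params.addAll(columns);
                break;
            case 'U': {
                List<String> nonKeys = new ArrayList<>(columns);
                nonKeys.removeAll(keys);
                sql = "UPDATE " + table + " SET "
                        + nonKeys.stream().map(c -> c + " = ?").collect(Collectors.joining(", "))
                        + " WHERE " + keys.stream().map(c -> c + " = ?").collect(Collectors.joining(" AND "));
                params.addAll(nonKeys);
                params.addAll(keys);
                break;
            }
            case 'D':
                sql = "DELETE FROM " + table + " WHERE "
                        + keys.stream().map(c -> c + " = ?").collect(Collectors.joining(" AND "));
                params.addAll(keys);
                break;
            default:
                throw new IllegalArgumentException("Unknown DML flag: " + flag);
        }
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (int i = 0; i < params.size(); i++) {
                ps.setObject(i + 1, values.get(params.get(i)));
            }
            ps.executeUpdate();
        }
    }
}

Because the INSERT lists its columns explicitly, the two extra DML fields in the temporary table never get in the way, and the UPDATE/DELETE WHERE clauses are built from whatever key columns the metadata declares for the current table.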
I have 200 tables on one server that I want to load into another 200 tables on a different server. One way to do this is to create 200 jobs and run them, but is there another way? Is it feasible to build a single dynamic job through which we can load all the tables? Waiting for a reply.
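If both ends are databases, JDBC metadata can remove the need for per-table schemas entirely: a single loop over the 200 table names can read each source table, discover its columns at runtime via ResultSetMetaData, and generate a matching INSERT. A minimal sketch, assuming the target tables already exist with the same column layout (connection handling and the loop over table names are left out):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class TableCopier {

    // Copy one table from the source connection to the target connection.
    public static void copyTable(Connection src, Connection tgt, String table) throws SQLException {
        try (Statement st = src.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
            // Discover the column list at runtime instead of hard-coding a schema.
            ResultSetMetaData md = rs.getMetaData();
            int n = md.getColumnCount();
            StringBuilder cols = new StringBuilder();
            StringBuilder marks = new StringBuilder();
            for (int i = 1; i <= n; i++) {
                if (i > 1) { cols.append(", "); marks.append(", "); }
                cols.append(md.getColumnName(i));
                marks.append("?");
            }
            String insert = "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
            try (PreparedStatement ps = tgt.prepareStatement(insert)) {
                while (rs.next()) {
                    for (int i = 1; i <= n; i++) {
                        ps.setObject(i, rs.getObject(i));
                    }
                    ps.addBatch(); // for very large tables, flush the batch in chunks
                }
                ps.executeBatch();
            }
        }
    }
}

In Talend, the list of table names could come from a metadata query feeding tFlowToIterate, with this logic in a tJavaRow or a routine; editions that support the Dynamic schema type can achieve much the same with generic database input/output components.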