I have fetched around 10 records from a database table (name "T1") using a stored procedure. Now I want to add these records into another table (name "T2") one by one. The target table "T2" has an identity field.
As each of these records is inserted, I need to write the identity value generated in table "T2" back to the corresponding row in table "T1".
Can anyone please help me with how to handle this?
I'm afraid you have to query the table T2 after each insert has been achieved to retrieve the identity value for every new record; then you can update the table T1.
If you are sure there is no risk of conflict with another process authorized to insert records into table T2, you can instead retrieve the current max value from the existing rows before your job injects the new rows.
In this case, you just have to manage a sequence on the Talend side.
Suppose the current max value has been stored in the global variable called "maxId". You then add a tMap or tJavaRow before the tOracleOutput component to compute the id value for each row, for example as follows (here in a tJavaRow):
output_row.Id = Numeric.sequence("seqId", (Integer)globalMap.get("maxId")+1, 1);
With this, the field called "Id" is populated by the sequence called "seqId", which starts at the current max value + 1 and is incremented by 1 for each row.
Now you can update the table T1 without having to query the table T2 again.
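To make the sequence behaviour concrete, here is a minimal, self-contained sketch of what the technique above does. Note this is a simplified re-implementation for illustration only, not Talend's actual `Numeric.sequence` routine, and the hard-coded `maxId` stands in for the value you would really read from `globalMap`:

```java
import java.util.HashMap;
import java.util.Map;

public class SequenceDemo {
    // Named counters, mimicking how a sequence routine keeps state per name.
    private static final Map<String, Integer> counters = new HashMap<>();

    // Returns startValue on the first call for a given name,
    // then increments by step on each subsequent call.
    public static int sequence(String name, int startValue, int step) {
        Integer current = counters.get(name);
        int next = (current == null) ? startValue : current + step;
        counters.put(name, next);
        return next;
    }

    public static void main(String[] args) {
        int maxId = 100; // assumption: stands in for (Integer) globalMap.get("maxId")
        // Each row passing through the tJavaRow gets the next id:
        for (int row = 0; row < 3; row++) {
            int id = sequence("seqId", maxId + 1, 1);
            System.out.println("row " + row + " -> id " + id); // ids 101, 102, 103
        }
    }
}
```

Because the ids are computed inside the job, you already know which id went with which row, which is exactly why no second query against T2 is needed.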
Hope this helps.
The technique you mention won't work, as there is a risk of conflict with another process authorized to insert records into table "T2".
I am using a .NET DLL to insert rows into table "T2". Can't we return the identity value from the .NET class being called back to the job, and then update that returned identity value in table "T1"?
Can you tell me how this can be done, i.e. catch the value returned by the .NET class and update that value in table "T1" from the job itself? This needs to be repeated for all records.
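The data flow being asked about can be sketched in plain Java. The `T2` class below is a hypothetical stand-in for the .NET DLL call (and for the real table with its identity column): its `insert` method returns the generated identity, the way `SCOPE_IDENTITY()` or JDBC's `getGeneratedKeys()` would, and the job immediately records that value against the T1 row:

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class IdentityFlowDemo {
    // Stand-in for table T2 with an identity column (assumption: in the real
    // job this role is played by the .NET DLL insert call).
    static class T2 {
        private int identity = 0;
        final Map<Integer, String> rows = new HashMap<>();

        // Inserts a row and returns the identity value it generated.
        int insert(String payload) {
            identity++;
            rows.put(identity, payload);
            return identity;
        }
    }

    public static void main(String[] args) {
        T2 t2 = new T2();
        // Stand-in for the identity column being written back to T1,
        // keyed here by a business key from the fetched record.
        Map<String, Integer> t1IdentityColumn = new HashMap<>();
        List<String> fetchedRows = Arrays.asList("rec1", "rec2", "rec3");

        for (String rec : fetchedRows) {
            int newId = t2.insert(rec);       // capture the returned identity
            t1IdentityColumn.put(rec, newId); // update T1 with it right away
        }
        System.out.println(t1IdentityColumn);
    }
}
```

Capturing the identity at insert time like this is safe against concurrent inserts, because the value comes from your own insert call rather than from re-querying the table afterwards.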