Five Stars

An issue of partial insertion of data into the target when any job fails.

We have a data set of 17 records in one of the source tables, with erroneous data in the 14th record that causes the job to fail. Because the commit size in the tMysqlOutput component is set to 10, only the first 10 records get inserted into the target before the job fails. On the next execution, after correcting the erroneous record, the job fetches all 17 records and completes successfully, which leaves duplicates of the first 10 records in the target.
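
For context, here is roughly what this behaviour amounts to in plain JDBC. This is only a minimal sketch: the connection URL, table, and column names are placeholders, and the bad 14th record is simulated with a thrown exception.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Minimal sketch of batched commits: with a commit size of 10, the first
// 10 rows are already committed when the bad 14th row makes the run fail,
// so they stay in the target while the uncommitted rows are rolled back.
public class PartialCommitDemo {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/demo", "user", "password")) {
            conn.setAutoCommit(false);
            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO target_table (id, payload) VALUES (?, ?)");
            int commitSize = 10;
            for (int row = 1; row <= 17; row++) {
                if (row == 14) {
                    throw new IllegalStateException("erroneous record"); // the job fails here
                }
                ps.setInt(1, row);
                ps.setString(2, "row " + row);
                ps.executeUpdate();
                if (row % commitSize == 0) {
                    conn.commit(); // rows 1-10 are now permanently in the target
                }
            }
            conn.commit(); // would commit the remaining rows, but is never reached
        }
    }
}
```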

 

What we tried (Q1):
To overcome this, we used the tMysqlRollback component, together with the tMysqlConnection and tMysqlCommit components. Is there any other option to use tMysqlRollback without the tMysqlConnection and tMysqlCommit components?
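
To illustrate why we are asking: as far as we understand, the rollback only makes sense inside a transaction that stays open across the whole load, which is what the shared connection provides. Below is a minimal plain-JDBC sketch of the same all-or-nothing pattern; the component names in the comments are only our reading of how the Talend components map onto it, and the connection details and table names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch of the all-or-nothing pattern we are after: one shared transaction,
// committed only if every row succeeds, rolled back otherwise.
public class AllOrNothingLoad {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/demo", "user", "password");
        conn.setAutoCommit(false);  // shared transaction, like tMysqlConnection with auto-commit off
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO target_table (id, payload) VALUES (?, ?)")) {
            for (int row = 1; row <= 17; row++) {
                ps.setInt(1, row);
                ps.setString(2, "row " + row);
                ps.executeUpdate();  // a bad record throws SQLException here
            }
            conn.commit();           // like tMysqlCommit: nothing is visible before this point
        } catch (SQLException e) {
            conn.rollback();         // like tMysqlRollback: the target stays untouched
            throw e;
        } finally {
            conn.close();
        }
    }
}
```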

Q2: We would also like to know about RAM usage and disk space consumption from a performance perspective. Could you please help us in this regard as well?

Eight Stars

Re: An issue of partial insertion of data into the target when any job fails.

Hello,

 

Q1: Change the commit size to 1 (not strictly necessary) and set the Action on data parameter to "Insert and Update", "Insert Ignore", or whatever else suits your situation.

https://help.talend.com/reader/4I8tDQGtrOPDl5MXAS3Q~w/aDNKleHXlevILu9pnbCoNg
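
To make this concrete, those actions make a re-run idempotent because MySQL either skips or updates rows whose key already exists. Here is a rough sketch of the plain-SQL equivalents, assuming the target table has a primary key on id; the table and column names are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Plain-SQL equivalents of the idempotent actions, assuming target_table has
// a primary key on id. Re-running the load then cannot create duplicates:
// rows whose key already exists are skipped or updated instead of re-inserted.
public class IdempotentInsert {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/demo", "user", "password")) {

            // Skip rows whose key already exists (insert-ignore behaviour).
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT IGNORE INTO target_table (id, payload) VALUES (?, ?)")) {
                ps.setInt(1, 1);
                ps.setString(2, "row 1");
                ps.executeUpdate();
            }

            // Or overwrite the existing row (insert-or-update behaviour).
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO target_table (id, payload) VALUES (?, ?) "
                    + "ON DUPLICATE KEY UPDATE payload = VALUES(payload)")) {
                ps.setInt(1, 1);
                ps.setString(2, "row 1 corrected");
                ps.executeUpdate();
            }
        }
    }
}
```

Note that both variants rely on the target table having a suitable primary key or unique index.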

 

Q2: Check this component. You can combine the information from its output with OS-level monitoring to see the resource consumption.
https://help.talend.com/reader/iYcvdknuprDzYycT3WRU8w/uNrRheNaNfWzKpfGyiwHcQ
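
Since the job runs in a JVM, you can also log heap and disk figures from inside the job itself and match them against what the OS tools report. A minimal sketch using the standard JVM APIs; in a Talend job this kind of code could sit in a tJava component, and the directory to check is just an example.

```java
import java.io.File;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Minimal sketch: print JVM heap usage and free disk space from inside the job,
// so the figures can be correlated with OS-level monitoring (top, vmstat, etc.).
public class ResourceSnapshot {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.printf("heap used: %d MB of %d MB max%n",
                heap.getUsed() / (1024 * 1024),
                heap.getMax() / (1024 * 1024));

        File workDir = new File("/tmp"); // placeholder: whichever directory the job writes temp files to
        System.out.printf("free disk space: %d MB%n",
                workDir.getFreeSpace() / (1024 * 1024));
    }
}
```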

 

Regards

Lojdr

Five Stars

Re: An issue of partial insertion of data into the target when any job fails.

Where can we see per-process resource consumption (CPU, RAM, network) in Talend Enterprise for Big Data? Is it possible to watch it from the AMC console?