I am trying to push data from one SQL Server table to another. I have Stats & Logs set up so that the stats, logs, and volumetrics data are captured to a database. We export the job and execute the .sh file to run it through a different scheduler.
I can get details such as when the job or a component execution started, but if the job runs for a long time, is there a way to determine how many rows have been processed at any given moment? Using "Monitor this connection", the number of rows processed is only updated once the job is complete.
Is it possible to have information in the logs (rows per second) similar to what Talend Studio displays when I run the job from Studio with Statistics enabled under the Advanced settings of the Run tab? This would be very helpful for identifying bottlenecks in jobs running in production.
The tFlowMeterCatcher component catches the processing volumetrics from the tFlowMeter component and passes them on to the output component, which counts the number of records passing through the specified flow.
Let us know if it is what you are looking for.
Thank you for the reply, Sabrina. I am already using the "Monitor this connection" setting on the row link, which works similarly to tFlowMeter. However, the processing volumetrics are only updated once the subjob execution is complete.
Is there a way to determine the speed at which rows are being processed while the subjob execution is still in progress?
Real-time statistics is a subscription-based feature that displays and collects all Job execution statistics in the Talend Administration Center.
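For jobs run outside Studio without that subscription feature, one possible workaround is to log throughput yourself, for example from a tJavaRow or tJavaFlex component placed on the flow. The helper class below is only an illustrative sketch (it is not part of any Talend API; the class and method names are invented): it counts rows and prints rows-per-second every N rows, so the job's stdout/log shows progress while the subjob is still running.

```java
// Illustrative sketch of periodic throughput logging, the kind of code one
// could place in a tJavaFlex "main" section. Names here are hypothetical,
// not Talend API.
public class ThroughputLogger {
    private final long startMillis = System.currentTimeMillis();
    private final int logEvery;   // emit a log line after this many rows
    private long rowCount = 0;

    public ThroughputLogger(int logEvery) {
        this.logEvery = logEvery;
    }

    /** Call once per row; prints cumulative rows/sec every logEvery rows. */
    public long tick() {
        rowCount++;
        if (rowCount % logEvery == 0) {
            // avoid division by zero on very fast runs
            long elapsed = Math.max(1, System.currentTimeMillis() - startMillis);
            System.out.printf("%d rows processed, %.1f rows/sec%n",
                    rowCount, rowCount * 1000.0 / elapsed);
        }
        return rowCount;
    }

    // Standalone demo simulating a flow of 5000 rows.
    public static void main(String[] args) {
        ThroughputLogger logger = new ThroughputLogger(1000);
        for (int i = 0; i < 5000; i++) {
            logger.tick();   // in a real Job this runs once per incoming row
        }
    }
}
```

Because the log line is written as rows stream through, it appears in the scheduler's captured output long before the subjob finishes, unlike the "Monitor this connection" counter.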