Add incremental number to output file


Hi everyone,


Is it possible to get output files with an incremental number?


Let's say we have a job, and this job creates the file output_001.csv.

If we start the job again, we'll get output_002.csv; if we start it a third time, output_003.csv, and so on.


Thanks for any advice.


Re: Add incremental number to output file



    If you are storing the last file number somewhere (a file, a database, etc.), you can read it from there into a context variable (say, running_number).


     While writing the file, you can build the file name in the output component as "output_"+context.running_number+".csv".


    Once the data has been written to the file, write the new number back to wherever you store the last output file number. This way it will work reliably. I would suggest using a database table to store this control information.
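The read-increment-write cycle described above can be sketched in plain Java (e.g. as a Talend routine). This is a minimal sketch assuming the counter lives in a local text file rather than a database table; the path `counter.txt`, the method name, and the three-digit zero-padding are illustrative choices, not part of the original answer.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class RunCounter {
    // Read the last run number from a counter file (0 if the file does not
    // exist yet), increment it, build the next output file name, and
    // persist the new number for the following run.
    public static String nextOutputName(String counterPath) throws IOException {
        Path p = Paths.get(counterPath);
        int last = 0;
        if (Files.exists(p)) {
            last = Integer.parseInt(Files.readString(p).trim());
        }
        int next = last + 1;
        // Zero-pad to three digits: output_001.csv, output_002.csv, ...
        String name = String.format("output_%03d.csv", next);
        Files.writeString(p, Integer.toString(next));
        return name;
    }
}
```

In a Talend job you would call this (or equivalent logic) in a tJava/tSetGlobalVar step before the output component, then pass the result to the file-name expression of the writer.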


Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved :-)




Re: Add incremental number to output file

The approaches below should help:


1) Keep a file containing the last run sequence on your local disk. Read this file at the start of the job and use the value to name the next run's output file. Then update the same file with the next run's sequence number.

(Note: if you are using a context file, you can simply update the new sequence number in that file.)


2) You could use an environment variable instead of a file. Get/set the value using tSystem and tSetEnv.


3) Using tFileFetch, get the name of the most recently created file in the output folder.

    Take the last part of the string; for output_003.csv that would be 003.

    Add 1 to it and use the new value to name the next output file.
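Approach 3 above — derive the next number from the files already in the output folder — can be sketched in plain Java outside of any Talend component. The folder path, class name, and the `output_NNN.csv` regex are assumptions for illustration:

```java
import java.io.File;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NextFromFolder {
    // Scan the output folder for files matching output_NNN.csv, find the
    // highest sequence number present, and return the next file name.
    public static String nextOutputName(String folder) {
        Pattern p = Pattern.compile("output_(\\d+)\\.csv");
        int max = 0;
        File[] files = new File(folder).listFiles();
        if (files != null) {
            for (File f : files) {
                Matcher m = p.matcher(f.getName());
                if (m.matches()) {
                    max = Math.max(max, Integer.parseInt(m.group(1)));
                }
            }
        }
        return String.format("output_%03d.csv", max + 1);
    }
}
```

One design note: this variant needs no separate counter store, but it assumes the output folder is never cleaned up between runs; if old files may be archived or deleted, the counter-file or database approach is safer.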



Abhishek KUMAR

