Add unique identifier while merging the files

Four Stars

Add unique identifier while merging the files

Hello Friends,

 

I am doing a data integration project using Talend. I have a requirement to merge multiple country feeds into one. Every file corresponds to one country, but the files don't contain any country identifier inside them.

The country can only be identified by the file name, which is unique per country. I used tFileList to merge the multiple files, but I am not finding a way to add a unique identifier to the rows of each file so that I can identify the country in the merged feed.

Example:

en-US, en-CA and fr-CA are the feeds.

 

Is there a way to do that?

 

Regards,

Mahesh

Community Manager

Re: Add unique identifier while merging the files

When you read the files, you will be using a tFileList global variable to get the filename and path. You can use ((String)globalMap.get("tFileList_1_CURRENT_FILE")) to retrieve the current filename and then add your identifier column using that data.
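For example, in a tMap placed after tFileInputDelimited you could add an extra output column (the name source_file below is only an example) and set its expression to the tFileList variable:

// tMap output-column expression for a hypothetical source_file column:
// the name of the file currently being iterated by tFileList_1
((String)globalMap.get("tFileList_1_CURRENT_FILE"))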

Four Stars

Re: Add unique identifier while merging the files

I used tFileList --> tFileInputDelimited --> tFileOutputDelimited.

I see the variable ((String)globalMap.get("tFileList_1_CURRENT_FILEPATH")), but I don't see any option in those components to map that expression to a new column.

Do I need to use any other component in the flow?

 

Thirteen Stars

Re: Add unique identifier while merging the files

Hi,

 

You can use tMap or tJavaRow for this; both allow you to add a new column and assign a value to it.
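For example, with tJavaRow the component code could look roughly like this (the column names id, value and country are only placeholders; use your actual schema, with country added as a new output column):

// copy the existing input columns through to the output
output_row.id = input_row.id;
output_row.value = input_row.value;
// fill the new column from the file currently iterated by tFileList_1
output_row.country = ((String)globalMap.get("tFileList_1_CURRENT_FILE"));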

-----------
Four Stars

Re: Add unique identifier while merging the files

Thank you. tMap worked.

On tMap, I was mistakenly looking for the expression builder on the left-hand (input) columns instead of the right-hand (output) columns.
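For anyone else with the same requirement, the output-column expression can be something along these lines (assuming the feeds are named after the locale, e.g. en-US.csv, so stripping the extension leaves the country identifier):

// hypothetical tMap expression for the new country column
((String)globalMap.get("tFileList_1_CURRENT_FILE")).replaceAll("\\.[^.]*$", "")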

