I am new to Talend and have had very little time to develop jobs. My apologies for asking basic questions.
My requirements are:
1. To fetch data from Excel files (4 or more). Please refer to the attached file. The file has 438 columns. Is there any issue with having so many columns?
2. First, store all the data in a staging table.
3. Then do the mapping and apply business rules. After that, load the data into the respective tables.
4. I am loading data into an OLTP table. The flat file has a Bordeaux ID and a Transaction Type. I need to first check the ID and then load the data. How can I use a case statement? If the loaded data has issues, I need to delete only the rows specific to a given Bordeaux ID. So do I need to introduce a separate key?
5. I will receive many Excel files, so how can I automate this? I mean, how can I trigger my job to load each file into the staging table?
6. How can I automate my entire job?
Please guide me.
First of all, you can create metadata in Talend using one sample Excel file that has the final format of your columns.
Then you can use the tFileList component to iterate over the Excel files one by one from a particular folder. (Make sure your files have the same base name with a sequence number, or something else that makes each one unique,
for example: Excel_1.xlsx, Excel_2.xlsx, Excel_3.xlsx, etc.)
That way, in tFileList you can set the filemask to Excel_*.xlsx.
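To see what that filemask iteration amounts to, here is a rough standalone sketch in plain Java (the folder and file names are made up for illustration; tFileList does all of this for you inside the job):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Rough sketch of what tFileList does with a filemask: iterate over the
// files in a folder whose names match a glob pattern such as "Excel_*.xlsx".
public class FileMaskDemo {

    // List filenames in `folder` matching the glob `mask`, sorted for a stable order.
    static List<String> listByMask(Path folder, String mask) {
        List<String> names = new ArrayList<>();
        try (DirectoryStream<Path> files = Files.newDirectoryStream(folder, mask)) {
            for (Path f : files) names.add(f.getFileName().toString());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        names.sort(null);
        return names;
    }

    // Build a throwaway folder with sample files so the sketch runs anywhere.
    static Path sampleFolder() {
        try {
            Path folder = Files.createTempDirectory("excel");
            Files.createFile(folder.resolve("Excel_1.xlsx"));
            Files.createFile(folder.resolve("Excel_2.xlsx"));
            Files.createFile(folder.resolve("notes.txt")); // does not match the mask
            return folder;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // Only the two Excel_*.xlsx files match the mask.
        System.out.println(listByMask(sampleFolder(), "Excel_*.xlsx")); // prints [Excel_1.xlsx, Excel_2.xlsx]
    }
}
```

Inside the job, each matched path is exposed through the tFileList global variable (the "current file" option) that you feed to the Excel input component.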
In tMap you can open the expression editor and write your case logic as Java ternary expressions to filter or transform the data, for example: "F".equals(row1.gender) ? "female" : "male". (Note that tMap expressions are Java, so compare strings with .equals(), not ==, and use double quotes — 'F' with single quotes is a char, not a String.)
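Since tMap expressions are plain Java, a multi-branch case statement becomes nested ternaries. A minimal standalone sketch (the gender column and its values are assumptions for illustration, not from your file):

```java
// Hypothetical standalone version of a tMap "case statement": the tMap
// expression editor takes plain Java, so a SQL CASE maps to nested ternaries.
public class TmapExpressionDemo {

    // Equivalent of: CASE WHEN gender = 'F' THEN 'female'
    //                     WHEN gender = 'M' THEN 'male'
    //                     ELSE 'unknown' END
    static String mapGender(String gender) {
        return "F".equals(gender) ? "female"
             : "M".equals(gender) ? "male"
             : "unknown";
    }

    public static void main(String[] args) {
        System.out.println(mapGender("F")); // prints female
        System.out.println(mapGender("X")); // prints unknown
    }
}
```

In the actual job you would paste only the ternary expression (with row1.gender in place of the parameter) into the output column of the tMap; the "F".equals(...) order also guards against a null column value.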
So your flow should be something like this:
tFileList ---> iterate ---> tFileInputExcel (using your metadata) ---> tMap ---> tDBOutput (staging table)
Or, without predefined metadata:
tFileList ---> iterate ---> tFileInputExcel (use the current file variable from tFileList and check the dynamic column option, so there is no need to define all 438 columns) ---> tMap (same case-statement expressions as above, e.g. "F".equals(row1.gender) ? "female" : "male") ---> tOracleOutput