One Star

Continue to next row in tMap on error

I have a tFileInputPositional -> tMap -> tMSSqlOutput sequence. The tMap takes the input (line by line) from tFileInputPositional and does a bunch of processing. The problem is that sometimes a row contains bad data, causing a Java error in the tMap. I don't control the input files, so I cannot ensure all rows are formatted properly. When the tMap fails on a row, I want it to just continue on to the next row instead of failing the whole job. Could somebody please explain how to do this? It should be something as simple as adding a try/catch block in the loop. Thanks.
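Conceptually, the behaviour I'm after is the plain-Java sketch below (not actual Talend-generated code; the file name and fixed-width field layout are just placeholders):

import java.io.BufferedReader;
import java.io.FileReader;

public class SkipBadRows {
    public static void main(String[] args) throws Exception {
        try (BufferedReader reader = new BufferedReader(new FileReader("input.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                try {
                    // example processing: positions 0-10 = id, 10-18 = amount
                    int id = Integer.parseInt(line.substring(0, 10).trim());
                    double amount = Double.parseDouble(line.substring(10, 18).trim());
                    System.out.println(id + " -> " + amount); // would be the DB insert
                } catch (Exception e) {
                    // bad row: log it and continue with the next one
                    System.err.println("Skipping bad row: " + line);
                }
            }
        }
    }
}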
2 REPLIES
Community Manager

Re: Continue to next row in tMap on error

Hello
There are two approaches you can take:
1) Filter the records with a condition you define on tFilterRow (or another component) and check the data quality before the tMap.
2) Put the tMap --> tMssqlOutput in a child job, e.g.:
in the parent job:
tFileInputPositional --> tFlowToIterate --> tRunJob
on tRunJob: uncheck the 'die on error' option and pass each row to the child job.
About how to pass a row/parameters to a child job, see 1918.
If you are working with TIS, you can use a Joblet to define the child job, which makes it easy to pass a flow to the child job.
in the child job:
tFixedFlowInput --> tMap --> tMssqlOutput
on tFixedFlowInput: generate an input flow from the values passed via tRunJob in the parent job.
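As an illustration of option 1), the check can be kept in a user routine (a plain Java class with static methods) and called from the tFilterRow or tMap expression, so a bad value is rejected instead of throwing. The class and method names below are only examples:

// user routine sketch, e.g. a class named RowChecks under Routines
public class RowChecks {

    // true when the value can be parsed as an integer
    public static boolean isInt(String value) {
        if (value == null) {
            return false;
        }
        try {
            Integer.parseInt(value.trim());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // true when the value can be parsed as a decimal number
    public static boolean isDouble(String value) {
        if (value == null) {
            return false;
        }
        try {
            Double.parseDouble(value.trim());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}

In tFilterRow (advanced mode) or a tMap filter, the condition would then look like RowChecks.isInt(row1.quantity) && RowChecks.isDouble(row1.amount), so malformed rows are dropped or routed to a reject flow before they reach the conversion that fails.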
Best regards

shong
----------------------------------------------------------
Talend | Data Agility for Modern Business
One Star

Re: Continue to next row in tMap on error

Hi, I have a similar issue. Here's what I'm trying to do:
- Read through a tFileInputDelimited and process each row.
- For each row...
--- if the processing runs successfully, append the processed output row to a tFileOutputDelimited (success) object
--- if the processing fails for whatever reason, append the original failing row to a different (failure) object
Eventually, for each input file processed, I should end up with either one complete success file, or two files, one containing success records and the other containing failure records (roughly the routing in the sketch below).
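In plain Java, the routing I'm after looks roughly like this (just a sketch to show the intent; the file names and the processing step are made up):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.PrintWriter;

public class RouteRows {
    public static void main(String[] args) throws Exception {
        // second FileWriter argument true = append, so each run adds to the
        // end of the files, creating them if they do not exist yet
        try (BufferedReader in = new BufferedReader(new FileReader("input.csv"));
             PrintWriter ok = new PrintWriter(new FileWriter("success.csv", true));
             PrintWriter bad = new PrintWriter(new FileWriter("failure.csv", true))) {
            String line;
            while ((line = in.readLine()) != null) {
                try {
                    String processed = process(line); // whatever the tMap does
                    ok.println(processed);            // processed row -> success file
                } catch (Exception e) {
                    bad.println(line);                // original row -> failure file
                }
            }
        }
    }

    // placeholder for the per-row processing that can fail on bad data
    private static String process(String line) {
        String[] fields = line.split(",");
        return fields[0] + "," + Double.parseDouble(fields[1].trim()) * 2;
    }
}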
I'm trying to do what shong mentioned above, breaking the processing out into a subjob, but I'm having some issues:
- How do I append each processed record to the end of a file (whether that file already exists or needs to be created)?
- Why are the various pieces of my flow in separate light-blue boxes (separate jobs?), even when I want them all in the same box?
- Even when I feed junk data into my subjob, it doesn't seem to trigger the OnComponentError flow; it just continues on to the next step of the 'main' flow.
Thanks for all your help!
--Josh