Hi,

As Shong has mentioned, you can use the 'Split output in several files' option in the Advanced settings tab of tFileOutputDelimited. This can ease the processing load when you have a very large file.

A possible setup:

tFileInputDelimited --main--> tFileOutputDelimited --on subjob ok--> tFileList --iterate--> further processing (to DB, etc.)
(your very large file)        (split the file every 100000 rows)     (loop through the split files)

Note that tFileInputDelimited is meant for CSV/delimited files. There are similar components for Excel files.
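Outside of Talend, here is a rough sketch of what the split step does conceptually: read a large delimited file and write it back out in chunks of 100000 rows, repeating the header in every part. The function name `split_csv` and the `part_N.csv` naming are my own illustration, not what Talend produces.

```python
import csv
import os

def split_csv(src_path, out_dir, rows_per_file=100_000):
    """Split a large CSV into files of at most rows_per_file data rows,
    repeating the header row in every output file."""
    os.makedirs(out_dir, exist_ok=True)
    written = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)  # keep the header so each part is self-describing
        part, writer, out = 0, None, None
        for i, row in enumerate(reader):
            if i % rows_per_file == 0:  # start a new part file
                if out:
                    out.close()
                part += 1
                path = os.path.join(out_dir, f"part_{part}.csv")
                out = open(path, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
                written.append(path)
            writer.writerow(row)
        if out:
            out.close()
    return written
```

The "further processing" subjob then simply loops over the returned file list, which is what tFileList + iterate gives you inside the Studio.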