We are using Talend Data Integration 7.x, and I have an issue with one of my requirements.
My pipeline is tDBInput -> tMap -> tDBOutput. The source tDBInput returns a huge volume of data (around 10 million rows), so I need to split it into ranges, for example rows 1 to 1,000,000 and 1,000,001 to 2,000,000.
Can I process each range sequentially so that the job does not fail?
I tried using row -> Iterate, but an Iterate link cannot be used from tDBInput to tMap.
Please suggest a better approach. I can't run the ranges in parallel because the data volume is huge and it might affect server capacity.
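One common pattern for this is to drive the job with a loop (e.g. tLoop or tFlowToIterate) that sets context variables for the current id range, and then parameterize the tDBInput query with those variables so each iteration reads one chunk sequentially. The range arithmetic behind that can be sketched in plain Java; the table name `src`, the key column `id`, and the assumption that ids are dense from 1 upward are all illustrative, not from the original post:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedExtract {
    // Compute inclusive [start, end] id ranges of size chunkSize covering 1..totalRows.
    static List<long[]> ranges(long totalRows, long chunkSize) {
        List<long[]> out = new ArrayList<>();
        for (long start = 1; start <= totalRows; start += chunkSize) {
            out.add(new long[]{start, Math.min(start + chunkSize - 1, totalRows)});
        }
        return out;
    }

    public static void main(String[] args) {
        for (long[] r : ranges(10_000_000L, 1_000_000L)) {
            // In Talend, each range would become one iteration: the bounds go into
            // context variables, and the tDBInput query references them, e.g.
            // "SELECT * FROM src WHERE id BETWEEN " + context.rangeStart + " AND " + context.rangeEnd
            String sql = "SELECT * FROM src WHERE id BETWEEN " + r[0] + " AND " + r[1];
            System.out.println(sql);
        }
    }
}
```

Because the iterations run one after another, only one chunk's rows are in flight at a time, which keeps memory and server load bounded.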