We are using Talend Data Integration 7.x, and I have an issue with one of my requirements.
My pipeline consists of the components tDBInput -> tMap -> tDBOutput. The source tDBInput returns a huge volume of data (around 10 million rows), so I need to split it into ranges and filter each one, for example rows 1 to 1,000,000, then 1,000,001 to 2,000,000, and so on.
Can I process each filtered range sequentially so that the job does not fail?
I tried using Row --> Iterate, but Talend does not allow an Iterate link from tDBInput to tMap.
Please suggest a better approach. I can't run the ranges in parallel, because the data volume is huge and parallel runs might exhaust the server's capacity.
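To make the range idea concrete, here is a minimal sketch (in plain Java, since Talend jobs compile to Java) of how the sequential chunking could be planned: split the total row count into consecutive id ranges, where each range would drive one run of the input query with a BETWEEN filter. The class name `ChunkPlanner`, the chunk size, and the `id` column are assumptions for illustration, not part of any Talend component API.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkPlanner {
    // Split [1, totalRows] into consecutive ranges of at most chunkSize rows.
    // Each long[] holds {startId, endId}, inclusive on both ends.
    static List<long[]> chunks(long totalRows, long chunkSize) {
        List<long[]> ranges = new ArrayList<>();
        for (long start = 1; start <= totalRows; start += chunkSize) {
            long end = Math.min(start + chunkSize - 1, totalRows);
            ranges.add(new long[]{start, end});
        }
        return ranges;
    }

    public static void main(String[] args) {
        // 10 million rows in chunks of 1 million -> 10 sequential passes.
        for (long[] r : chunks(10_000_000L, 1_000_000L)) {
            // In a Talend job, each iteration would set context variables
            // (e.g. context.startId / context.endId, hypothetical names)
            // consumed by the tDBInput query's WHERE clause.
            System.out.println("WHERE id BETWEEN " + r[0] + " AND " + r[1]);
        }
    }
}
```

In a Talend job this pattern would typically be wired as tLoop (or tFlowToIterate) --> tDBInput --> tMap --> tDBOutput, with the loop counter feeding the range bounds into the query via context variables, so each million-row slice is fetched and written before the next one starts.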