Hi everyone,
I'm writing to ask whether anyone has run into the same problem as me. I have a Talend job running on an EC2 instance that is used to copy a large number of rows from a Postgres DB. My problem is that even if I raise the WCU (up to a maximum of 5000), the actual write speed never exceeds the write speed I had with 10 WCU.
Is it possible to create parallel jobs or different streams in the same job, so that each of them processes a different chunk of data?
Has anyone encountered the same issue? Any tips on the matter?
Thank you in advance.
Did you try running parallel flows to extract different parts of the data using the tParallelize component? You can attach multiple tDBInput components to it so that each of them runs in parallel, each reading a different chunk of the source table.
Also allocate additional memory to make sure the Talend job has enough headroom to run the parallel processes.
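To illustrate the idea (not a definitive Talend recipe), here is a minimal Java sketch of the kind of partitioned extraction that tParallelize with several tDBInput components gives you: each worker runs its own query against a disjoint slice of the source table, split by id modulo. The table source_table, the columns id and payload, and the connection details are hypothetical placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelExtractSketch {
    static final int PARTITIONS = 4;  // one stream per partition
    static final String JDBC_URL = "jdbc:postgresql://db-host:5432/mydb";  // hypothetical

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(PARTITIONS);
        for (int p = 0; p < PARTITIONS; p++) {
            final int partition = p;
            pool.submit(() -> extractChunk(partition));  // each stream handles its own chunk
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    // Reads only the rows whose id falls into this worker's partition.
    static void extractChunk(int partition) {
        String sql = "SELECT id, payload FROM source_table WHERE id % ? = ?";  // hypothetical table/columns
        try (Connection conn = DriverManager.getConnection(JDBC_URL, "user", "password");
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, PARTITIONS);
            ps.setInt(2, partition);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // In the real job this is where each stream would batch and
                    // write its rows to the target, instead of just reading them.
                    handleRow(rs.getLong("id"), rs.getString("payload"));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    static void handleRow(long id, String payload) {
        // Placeholder for the per-row transform/write step.
    }
}

In the Talend job itself, the equivalent would be giving each tDBInput the same query with a different "WHERE id % N = <n>" filter, so the streams never overlap.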
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved :-)
Slow write speed is very annoying when transferring large amounts of data to AWS. After comparing various cloud management platforms to pick the best one for the job, it is disappointing not to get optimal transfer performance. One way I increase transfer speed is by adding extra memory and running parallel jobs.