[resolved] How to improve performance for the job?


I have a job that loads data from a flat file into an Oracle database. The flat file is processed several times: once per dimension (to insert new dimension rows) and once to load the fact table. Each dimension pass uses a tMap to find unique new dimension values and a tOracleOutput to insert them. The final fact pass also uses a tMap to look up the dimension IDs and then a tOracleOutput to insert the new fact rows.
Performance is fine when there is little or no data in the database, but it degrades sharply as the tables grow. The flat files range from 50 MB to 1 GB, with at least 50K records. Any suggestions on improving the performance?
What is the best practice for this kind of data loading to keep performance consistent?

Re: [resolved] How to improve performance for the job?

Have you tried enabling the "Use Batch" option on the tOracleOutput component to activate batch mode for inserts?
Would you mind posting screenshots of your job design to the forum?
Best regards
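The "Use Batch" option makes tOracleOutput group rows into JDBC batches instead of issuing one database round trip per row. A minimal sketch of that idea follows; the class name and batch size are hypothetical, and a real job would call `PreparedStatement.addBatch()`/`executeBatch()` where the comments indicate:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of batched inserts: rows are buffered and
// flushed to the database in groups, cutting 50,000 round trips to 5.
public class BatchInserter {
    private final int batchSize;
    private final List<String[]> buffer = new ArrayList<>();
    private int flushCount = 0; // number of round trips to the database

    public BatchInserter(int batchSize) {
        this.batchSize = batchSize;
    }

    // In a real job this would call PreparedStatement.addBatch()
    public void add(String[] row) {
        buffer.add(row);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // In a real job this would call executeBatch() and commit
    public void flush() {
        if (buffer.isEmpty()) return;
        flushCount++;
        buffer.clear();
    }

    public int getFlushCount() {
        return flushCount;
    }

    public static void main(String[] args) {
        BatchInserter inserter = new BatchInserter(10_000);
        for (int i = 0; i < 50_000; i++) {
            inserter.add(new String[] { "dim" + i });
        }
        inserter.flush(); // flush any remaining partial batch
        System.out.println("round trips: " + inserter.getFlushCount());
    }
}
```

With a batch size of 10,000, the 50,000 rows above cost 5 round trips instead of 50,000, which is where most of the speedup comes from.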

Re: [resolved] How to improve performance for the job?

Hi hjiang,
Can you try loading with a tOracleOutputBulkExec component as the output?
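tOracleOutputBulkExec writes the incoming rows to a flat file and then loads that file with Oracle's SQL*Loader, which bypasses row-by-row INSERT statements and is usually much faster for large volumes. A hypothetical control file of the kind such a bulk load uses (table and column names are made up for illustration):

```
-- Hypothetical SQL*Loader control file for a dimension load
LOAD DATA
INFILE 'dim_customer.csv'
APPEND
INTO TABLE dim_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, customer_name, region)
```

APPEND adds rows to the existing table; the component generates a file and control options like these from the schema you configure, so you normally do not write the control file by hand.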

Re: [resolved] How to improve performance for the job?

This thread should be moved to the Open Studio forum.

Re: [resolved] How to improve performance for the job?

Thanks Siva and Sabrina. I will test it out and let you know the results later.
