[resolved] How to improve performance for the job?

One Star

I have a job that loads data from a flat file into an Oracle DB. The flat file is processed several times: once per dimension (to insert new dimension rows) and once to load the fact table. Each dimension pass uses a tMap to find the unique new dimension values and a tOracleOutput to insert them. The final fact pass also uses a tMap to look up the dimension IDs and then a tOracleOutput to insert the new fact rows.
Performance is fine while there is little or no data in the DB, but it degrades sharply as the tables grow. The flat file ranges from 50 MB to 1 GB, with at least 50K records. Any suggestions on improving the performance?
What is the best practice for this kind of data loading to keep performance good?
Moderator

Re: [resolved] How to improve performance for the job?

Hi,
Have you tried checking the "Use Batch" option in the tOracleOutput component to activate batch mode for data processing?
Would you mind posting screenshots of your job design to the forum?
Best regards
Sabrina
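For context on why batch mode helps: instead of one database round trip per row, rows are buffered and sent in groups. This standalone Java sketch simulates that effect with a hypothetical BatchWriter class (no real database involved; the class and method names are illustrative, not Talend's internals):

```java
// Minimal, self-contained sketch of why batch inserts help: with a
// batch size of 1000, 50,000 rows cost 50 "round trips" instead of 50,000.
// BatchWriter is a hypothetical class for illustration only.
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    static class BatchWriter {
        private final int batchSize;
        private final List<String> buffer = new ArrayList<>();
        private int roundTrips = 0;      // counts simulated executeBatch() calls

        BatchWriter(int batchSize) { this.batchSize = batchSize; }

        void write(String row) {
            buffer.add(row);             // analogous to PreparedStatement.addBatch()
            if (buffer.size() == batchSize) flush();
        }

        void flush() {                   // analogous to executeBatch(): one round trip
            if (buffer.isEmpty()) return;
            roundTrips++;
            buffer.clear();
        }

        int roundTrips() { return roundTrips; }
    }

    public static void main(String[] args) {
        BatchWriter w = new BatchWriter(1000);
        for (int i = 0; i < 50_000; i++) w.write("row-" + i);
        w.flush();                       // flush any trailing partial batch
        System.out.println("round trips: " + w.roundTrips()); // prints 50
    }
}
```

In a real job the buffering is done by the Oracle JDBC driver via addBatch()/executeBatch(); the broader point is that the number of round trips, not the number of rows, usually dominates insert throughput.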
Four Stars

Re: [resolved] How to improve performance for the job?

Hi hjiang,
Can you try loading with tOracleOutputBulkExec as the output?
Thanks,
Siva.
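For background: the Oracle bulk components write the incoming rows to a flat file and then load it with Oracle's SQL*Loader utility, which avoids row-by-row INSERTs entirely and is usually much faster for large volumes. A conceptual SQL*Loader control file for such a load might look like this (the table, file, and column names are assumptions for illustration only):

```
LOAD DATA
INFILE 'fact_data.csv'
APPEND
INTO TABLE sales_fact
FIELDS TERMINATED BY ';'
(customer_id, product_id, sale_date DATE 'YYYY-MM-DD', amount)
```

The trade-off is that a bulk load bypasses some per-row error handling, so rejected rows need to be checked in the loader's log and bad files afterwards.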
Employee

Re: [resolved] How to improve performance for the job?

This thread should be moved to the Open Studio forum here
One Star

Re: [resolved] How to improve performance for the job?

Thanks, Siva and Sabrina. I will test these out and let you know the results later.
