Hi, I was wanting some insight into using bulk inserts into Oracle. I have an old legacy system with a flat table structure that needs to be converted into multiple insert rows in the new system's database. Currently this is done through the Oracle insert component, but it is extremely slow, and I've seen that there are Oracle bulk inserts. It seems, however, that a CSV needs to be output first before it can be bulk inserted. I have a few questions for anyone who has gone through a similar exercise:
1. Any suggestions/tips for the fastest way to do a transform of this type?
2. Is there any speed gain from outputting a CSV (replacing the Oracle output block) and inserting via the Oracle bulk insert, rather than one insert at a time?
On the first question: output the data with a tOracleOutputBulk component, then use tOracleBulkExec to load the file into the Oracle table. In the tOracleBulkExec component, set the columns you want to import in the component's schema. You may need special Oracle privileges to be allowed to use the Oracle SQL*Loader utility.

On the second question: in a comparable exercise, bulk inserting into a SQL Server table, even including FTPing the file to the SQL Server box, was 3 to 4 times faster than single inserts.
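For context on what happens under the hood: tOracleBulkExec drives Oracle's SQL*Loader, which reads the delimited file according to a control file. A minimal sketch of what an equivalent load looks like outside Talend (the table name, column names, and file paths here are hypothetical, not from your job):

```
-- legacy_load.ctl (hypothetical SQL*Loader control file)
LOAD DATA
INFILE 'legacy_rows.csv'
APPEND
INTO TABLE new_system_rows
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, order_date DATE 'YYYY-MM-DD', amount)
```

This would then be run with something like `sqlldr userid=user/pass@db control=legacy_load.ctl`. Talend generates the equivalent of this for you; it's shown only so you can see why bulk loading is so much faster than row-by-row inserts.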
Without showing you the complex job I'm using, would you be able to say whether it's possible to replace the existing Oracle output block with a tOracleOutputBulk? Does this then output CSV files? I'm a little worried about the complexity of this, as the resulting output files are going to be anything up to 500m rows.
tOracleOutputBulk does indeed output a CSV file, so you'll have to make sure you have enough disk space for your 500m rows. The details of how the file is structured shouldn't matter, because the file will be immediately read by the tOracleBulkExec component. If I were you, I would just try this out with 100 records or so and see what happens. Have a play.
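If you want to sanity-check the disk-space side before touching the job, a small sketch like this generates a file in the same spirit as tOracleOutputBulk's output (the columns, separator, and row count here are assumptions for illustration; the component lets you configure separator and enclosure):

```python
import csv
import io

def write_bulk_file(rows, out):
    """Write rows as simple comma-delimited lines, similar in spirit to the
    flat file tOracleOutputBulk produces for SQL*Loader to consume."""
    writer = csv.writer(out)
    for row in rows:
        writer.writerow(row)

# Try with ~100 records first, as suggested, before scaling up.
sample = [(i, f"name_{i}", round(i * 1.5, 2)) for i in range(100)]
buf = io.StringIO()
write_bulk_file(sample, buf)
lines = buf.getvalue().splitlines()
print(len(lines))   # 100 data lines
print(lines[0])     # 0,name_0,0.0
```

Measuring the byte size of a 100-row sample and multiplying up gives a rough estimate of the disk space a 500m-row extract would need.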