Oracle Load errors out after 10000 records

One Star

Oracle Load errors out after 10000 records

Hi
I have a simple load from a flat file into a remote Oracle database. The first 10k records fly into the database but the load errors out afterwards
Starting job LoadHUBAccounts at 15:12 23/03/2009. connecting to socket on port 3827
connected
Exception in component tOracleOutput_1
java.lang.ArrayIndexOutOfBoundsException: -32303
at oracle.jdbc.driver.OraclePreparedStatement.setupBindBuffers(OraclePreparedStatement.java:2673)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10689)
at custody_plus.loadhubaccounts_0_1.LoadHUBAccounts.tFileInputPositional_1Process(LoadHUBAccounts.java:3260)
at custody_plus.loadhubaccounts_0_1.LoadHUBAccounts.runJobInTOS(LoadHUBAccounts.java:3434)
at custody_plus.loadhubaccounts_0_1.LoadHUBAccounts.main(LoadHUBAccounts.java:3343)
disconnected
I'm assuming this is an Oracle driver issue, but I'm not sure.
Cheers
Al
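A negative index coming out of setupBindBuffers is consistent with a 16-bit (short) counter inside the driver silently wrapping past 32,767. The internal counter itself is an assumption on my part; only the narrowing-cast arithmetic below is certain, and it does reproduce the exact -32303 from the log above:

```java
// Sketch: how a 16-bit (short) counter inside the Oracle JDBC driver could
// wrap to the negative array index seen in the stack trace. The idea that
// the driver accumulates bind-buffer counts in a short is an assumption;
// the narrowing-cast arithmetic is exact.
class BindBufferOverflow {
    // Mimics a silent int-to-short narrowing, as a driver-internal
    // index variable would experience past Short.MAX_VALUE.
    static short asDriverIndex(int accumulatedBinds) {
        return (short) accumulatedBinds; // wraps for values above 32767
    }

    public static void main(String[] args) {
        // 33233 wraps to -32303, the index reported in the exception
        System.out.println(asDriverIndex(33233));
    }
}
```

If that is what is happening, any combination of batch size and column count whose total bind values exceed 32,767 could trigger it, which would explain why the failure point moves around with the commit/batch settings.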
One Star

Re: Oracle Load errors out after 10000 records

Check your advanced settings tab - there is likely a setting similar to "Commit Every XXXX rows"
If that's the case and it's set to 10,000 - we're on to something.
Also, do those first 10,000 records actually get loaded to the database? Have you confirmed they're actually in the table?
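For anyone unsure what that setting controls: here is a rough plain-JDBC sketch of the kind of batched insert loop a tOracleOutput component generates. The table name, columns, and row source are hypothetical, and the real generated code differs in detail; the point is that executeBatch() and commit() fire once per "Commit Every" interval, plus once for the final partial batch.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Hypothetical sketch of a commit-interval insert loop, roughly in the
// style of Talend-generated code. Table and columns are made up.
class BatchInsertSketch {

    // Number of executeBatch()/commit() flushes needed for rowCount rows
    // at a given interval (the final partial batch counts as one flush).
    static int flushCount(int rowCount, int commitEvery) {
        return (rowCount + commitEvery - 1) / commitEvery;
    }

    static void load(Connection conn, String[][] rows, int commitEvery) throws SQLException {
        conn.setAutoCommit(false);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO hub_accounts (account_id, account_name) VALUES (?, ?)")) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++pending == commitEvery) {
                    ps.executeBatch(); // the call that fails in the stack trace above
                    conn.commit();
                    pending = 0;
                }
            }
            if (pending > 0) { // flush the final partial batch
                ps.executeBatch();
                conn.commit();
            }
        }
    }
}
```

With a 65k-row file and the interval raised to 100k (as in the post further down), the interval is never reached mid-load, so everything goes through a single final flush.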
One Star

Re: Oracle Load errors out after 10000 records

I raised the number of records before commit to 100k and all my rows inserted with no problem. The file had 65k records, so it never reached the commit point.
Previously with the commit point at 10k the first 10k records inserted and then the job errored out as above.
I'll try to re-create with the larger commit size at 100k.
Not applicable

Re: Oracle Load errors out after 10000 records

Hi,
I have the same problem in a job inserting in an oracle table.
If I set the commit interval to anything between 5,000 and 20,000, the job aborts after the second commit with this error:
Exception in component tOracleOutput_1
java.lang.ArrayIndexOutOfBoundsException
at oracle.jdbc.driver.OraclePreparedStatement.setupBindBuffers(OraclePreparedStatement.java:2673)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10689)
at dev_nsa.fichier_flabe_xml_0_1.Fichier_FLABE_XML.tFileInputDelimited_1Process(Fichier_FLABE_XML.java:11179)
at dev_nsa.fichier_flabe_xml_0_1.Fichier_FLABE_XML.runJobInTOS(Fichier_FLABE_XML.java:13402)
at dev_nsa.fichier_flabe_xml_0_1.Fichier_FLABE_XML.main(Fichier_FLABE_XML.java:13304)

If I set the commit interval to 2,000 or 50,000, it works (slowly, but it works).
I use a tOracleOutput with an Oracle SID connection and Oracle 10.
With Talend Open Studio 3.1.0.RC1_r23203.
Thank you,
Nicolas
Not applicable

Re: Oracle Load errors out after 10000 records

For your information:
The problem was solved in the latest version of TOS: 3.1.0.r24382.
Nicolas
One Star

Re: Oracle Load errors out after 10000 records

This error has re-appeared! I was running Jobs in Open Studio V4.2.2 and the Jobs were working correctly. However, my management asked me to investigate the MDM tool, so I downloaded MDM CE V4.2.3 r67267 and installed it. I migrated my Jobs into the new version and tried to run them. They errored, and on investigation the error was exactly as described in this post.

The failure only happens with the UPDATE clause in a tOracleOutput. When set to 10,000 rows it errors as above; I raised it to 1,000,000 rows and the fault disappeared and the Job completed successfully. This is a real pain, as to test the Jobs in the MDM environment I now have to rework all the affected Jobs to take account of this.
One Star

Re: Oracle Load errors out after 10000 records

I managed to circumvent this bug not by increasing the commit size but, on the contrary, by making it much smaller (10 rows). I have a process with over 500k rows, so this is quite slow, but it could be another solution to this bug, which continues to appear in Version 5.4.1, Build id: r111943-20131212-1133 (community edition).
I tried to increase the commit to 1M rows but hit a Java heap issue there, which is why I just made my commits smaller instead.
Seventeen Stars

Re: Oracle Load errors out after 10000 records

This is a mystery to me. We output data to an Oracle database and usually work with a batch size of 10k and a commit size of 10k, and it works well.
It is not a bug in Talend; it is a bug in the Oracle JDBC driver. You should check whether you can replace or update the driver.
Here is an older discussion showing that this error happens to other developers too:
https://community.oracle.com/message/2255186
One Star

Re: Oracle Load errors out after 10000 records

This is also a problem in 5.5 running against SQL Server 2014. The solution is to open the output component's Advanced settings and disable "Use Batch Size". The commit size does not matter.
One Star

Re: Oracle Load errors out after 10000 records

Using Oracle 11g and Talend Open Studio Build id: V5.6.1_20141207_1530, I found that setting your commit to 1,000 as opposed to the standard 10,000 does the trick and imports all the data. Not sure why; I just tried it (based on a prior comment) and confirmed on several data streams that this is the case.
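One possible reason a smaller commit helps: if the driver tracks (rows in batch) × (bind parameters per row) in a signed 16-bit quantity, the safe batch size depends on the column count. That 16-bit limit is an assumption based on the negative-index symptom, not a documented fact, but the arithmetic fits the reports in this thread:

```java
// Hedged arithmetic only: assumes the Oracle JDBC driver accumulates
// total bind values (batch rows x binds per row) in a signed 16-bit
// value. Under that assumption, these are the combinations that would
// overflow Short.MAX_VALUE (32767).
class SafeBatchSize {
    static boolean overflows16Bit(int batchRows, int bindsPerRow) {
        return (long) batchRows * bindsPerRow > Short.MAX_VALUE;
    }
}
```

Under that assumption, 10,000 rows with four bound columns (40,000 bind values) would overflow, while 1,000 rows would not, which matches "1,000 works, 10,000 fails" and also why the failure depends on the table.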
One Star

Re: Oracle Load errors out after 10000 records

java.lang.ArrayIndexOutOfBoundsException
We are upgrading from 5.1 to 5.6.1 and I could not figure out what this tOracleOutput error was until I found this thread. The issue was that the cursor setting on the Oracle components needed to be changed to 1000, and then they worked. Thank you oscarvalles!
One Star

Re: Oracle Load errors out after 10000 records

Try switching to the ojdbc7 jar (instead of ojdbc6), since it's a newer version that does not appear to have this problem.
You might have to edit your connection .properties file with a text editor to switch it from ojdbc6-11 to ojdbc7-12, then go into Talend and "modify" the connection without changing anything; when you hit Finish it will prompt you to update all jobs that use it.
(FYI, the ojdbc7-12 driver is compatible with Oracle 11g as well as 12c, and probably 10g too, but I hope you aren't using that.)
Four Stars

Re: Oracle Load errors out after 10000 records

Same problem here, loading several rows into an Oracle table using the ojdbc6.jar JDBC driver:
Exception in component tOracleOutput_1
java.lang.ArrayIndexOutOfBoundsException: -32467
Building the job to run it outside of TOS DI (replacing ojdbc6.jar in the lib folder with ojdbc7.jar and adjusting the batch file that starts the job), the problem is solved; no more errors.
Now, how can I tell Talend (TOS DI) to use ojdbc7.jar instead of ojdbc6 in all tOracle* components, without using tOracleConnection, since I need to specify the commit interval? That way I could run it inside TOS.
One Star

Re: Oracle Load errors out after 10000 records

Try switching to the ojdbc7 jar (instead of ojdbc6), since it's a newer version that does not appear to have this problem.
You might have to edit your connection .properties file with a text editor to switch it from ojdbc6-11 to ojdbc7-12, then go into Talend and "modify" the connection without changing anything; when you hit Finish it will prompt you to update all jobs that use it.
(FYI, the ojdbc7-12 driver is compatible with Oracle 11g as well as 12c, and probably 10g too, but I hope you aren't using that.)

Hi, can you tell me where I can find the .properties file to switch it from ojdbc6-11 to ojdbc7-12, please?
One Star

Re: Oracle Load errors out after 10000 records

Get the new Oracle ojdbc6.jar from Oracle 11.2.0.4; it has this problem fixed.
Copy it to c:/talend/TOS_BD-20150908_1633-V6.0.1/workspace/<your project>/temp/dbWizard and replace the old one there.
Andrew
One Star

Re: Oracle Load errors out after 10000 records

The link has been removed from the previous post, so google "oracle ojdbc6.jar" and select the first result.
Andrew
One Star

Re: Oracle Load errors out after 10000 records

Thank you oscarvalles!
It is solved by setting "Commit every" to 1000, as opposed to the standard 10,000, in the Advanced settings of the tOracleOutput component view.