Hi,

We are trying to read a large amount of data and load it into a CSV file. This job uses "Dynamic Schema", as it is a generic job triggered for more than one table. It was failing with java.lang.OutOfMemoryError: Java heap space, so we enabled streaming in tMysqlInput. Now the job gives the following error:

Exception in component tMysqlInput_1
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@311671b2 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:933)
    at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2697)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1868)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2113)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2562)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2512)
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1476)
    at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:2948)
    at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:593)
    at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:447)

Please see the attached image. This job snippet is part of a subjob.

Thanks,
Sangi
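For context, Talend's "Enable stream" option corresponds to MySQL Connector/J's streaming mode, which in plain JDBC is enabled roughly as sketched below. This is a minimal sketch, not the actual generated Talend code; it assumes a reachable MySQL server, and the URL, credentials, and table name are placeholders.

```java
import java.sql.*;

public class StreamingReadSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- adjust for your environment.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            // Connector/J switches to row-by-row streaming when the fetch size
            // is Integer.MIN_VALUE; rows are then no longer buffered in memory.
            stmt.setFetchSize(Integer.MIN_VALUE);
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM my_table")) {
                while (rs.next()) {
                    // process the current row...
                }
                // While rs is open, NO other statement may be issued on conn --
                // doing so raises exactly the "Streaming result set ... is
                // still active" SQLException quoted above.
            } // rs.close() here frees the connection for further queries
        }
    }
}
```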
Hi,

Is this part of a subjob that is triggered many times? It seems the DB connection is opened many times. This is a common MySQL problem that many people have run into; the usual workaround is to disable the 'Enable stream' option. If possible, please export your job and send it to me via email, and I will see whether we can optimize or re-design it to resolve the OutOfMemoryError or the MySQL error.

Shong
Hi Shong,

The main job and the subjob connect to different databases. The main job prepares the standard SQL query and passes it to the subjob, which runs the query against a different database. The image attached earlier is the entire subjob: it just reads all the data and loads it into the flat file. That is all. The only problem I see is that the tMysqlInput component uses both dynamic schema and "Enable stream"; it looks like this combination does not work together. I will see if I can export the job and send it to you.

Thank you,
sangi
If you disable the "Enable Stream" option, the driver collects all rows into memory and only delivers them once the collection is complete. This often causes OutOfMemoryError (heap space) problems. Dynamic schema and the "Enable Stream" option normally have nothing to do with each other: dynamic schema simply means the schema columns are retrieved from the result set metadata. If this does not work together with streaming, try a newer JDBC driver, loading it with the tLibraryLoad component BEFORE the subjob in which the MySQL components are used.
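Judging from the stack trace in the first post, the two features can in fact collide here: ResultSetMetaData.getPrecision() makes Connector/J issue an internal query (via getMaxBytesPerChar) on the same connection, which is exactly what an open streaming result set forbids. For reference, retrieving a dynamic schema from result set metadata in plain JDBC looks roughly like this (a sketch only; the ResultSet is assumed to come from some earlier query):

```java
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;

public class DynamicSchemaSketch {
    // Sketch of how a dynamic schema is derived from result set metadata.
    // 'rs' is assumed to be an open ResultSet from an earlier query.
    static void printSchema(ResultSet rs) throws SQLException {
        ResultSetMetaData md = rs.getMetaData();
        for (int i = 1; i <= md.getColumnCount(); i++) {
            System.out.println(md.getColumnName(i) + " " + md.getColumnTypeName(i));
            // NOTE: calling md.getPrecision(i) is what, per the stack trace
            // above, triggers an internal query on the same connection --
            // and fails while a streaming result set is still open.
        }
    }
}
```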
I don't know if this is relevant to this exact case, but I stumbled on this thread while I myself had a similar error: "Streaming result set com.mysql.jdbc.RowDataDynamic@XXXXXXXX is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries." In my case, the last subjob (which I think also caused the error) read from a temp table and wrote to the final target table, both in the same DB, and both components used the same connection. Creating a second connection, so that the reading of the temp table was kept separate from the connection writing to the final target table, solved my error and let me keep "Enable Stream" on.
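The two-connection fix described above can be sketched in plain JDBC as follows. This is a hypothetical sketch, not Talend-generated code; the URL, credentials, and the temp_table/target_table names and columns are placeholders.

```java
import java.sql.*;

public class TwoConnectionCopySketch {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:mysql://localhost:3306/mydb"; // placeholder
        // One connection dedicated to the streaming read...
        try (Connection readConn = DriverManager.getConnection(url, "user", "password");
             // ...and a second one for the writes, so the INSERTs never run on
             // the connection whose streaming result set is still open.
             Connection writeConn = DriverManager.getConnection(url, "user", "password");
             Statement readStmt = readConn.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
             PreparedStatement insert = writeConn.prepareStatement(
                     "INSERT INTO target_table (id, val) VALUES (?, ?)")) {
            readStmt.setFetchSize(Integer.MIN_VALUE); // streaming on readConn only
            try (ResultSet rs = readStmt.executeQuery(
                    "SELECT id, val FROM temp_table")) {
                while (rs.next()) {
                    insert.setLong(1, rs.getLong(1));
                    insert.setString(2, rs.getString(2));
                    insert.addBatch();
                }
                insert.executeBatch();
            }
        }
    }
}
```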