Jobs fail in Talend Cloud fairly regularly (a few times a week) with database connection errors, and it is not the same jobs or databases each time. Sometimes they are Oracle connection-reset errors (see A below); this particular Oracle environment is on premises. Other times they are PostgreSQL errors (see B below); this PostgreSQL database is on AWS, and the error looks as though the connection variables were not passed properly. Why does this happen? The databases are up and reachable, and when a failed job is rerun it almost always succeeds. However, the failures cause execution plans to fail and subsequent jobs not to run, and it is a production-support drain.
A) Exception in component tOracleInput_1 java.sql.SQLRecoverableException: IO Error: Connection reset at oracle.jdbc.driver.T4CConnection.logon
B) Exception in component tPostgresqlInput_1 (GL7113C_Flex_Values) java.sql.SQLException: No suitable driver found for jdbc:postgresql://:/ at java.sql.DriverManager.getConnection(DriverManager.java:689) at java.sql.DriverManager.getConnection(DriverManager.java:247) at financial.gl7113c_flex_values_0_1.GL7113C
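The URL in trace B suggests the host, port, and database values were empty when the connection string was built. A minimal sketch of how empty context parameters collapse to exactly the string in the stack trace (class and method names here are illustrative, not the generated Job code):

```java
public class JdbcUrlCheck {
    // Build a PostgreSQL JDBC URL from context-parameter values,
    // the way generated connection code typically concatenates them.
    static String buildUrl(String host, String port, String db) {
        return "jdbc:postgresql://" + host + ":" + port + "/" + db;
    }

    // Guard: fail fast with a clear message instead of the cryptic
    // "No suitable driver found" raised deep inside DriverManager.
    static String requireNonEmpty(String name, String value) {
        if (value == null || value.isEmpty()) {
            throw new IllegalStateException("Context parameter missing: " + name);
        }
        return value;
    }
}
```

With empty strings, `buildUrl("", "", "")` yields `jdbc:postgresql://:/`, the exact URL in the error, which is why `DriverManager` reports that no registered driver accepts it.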
For the first issue, the Oracle connection reset, it is not clear what the problem could be. We need more information / logs if possible.
The second one is clear, though. It is a known issue: sometimes during task execution, context parameters are taken from another context (or simply come through as null values). We are working on the fix now and expect to update production in the coming days.
Thank you. Actually, the Oracle connection error is usually a null pointer exception, so I think it may be the same problem as what we see with PostgreSQL. Will the update be cloud-only and seamless to users, or will there also be a new Remote Engine version?
How can we know when this has been put into production?
Yes, it looks like the same issue with Oracle as well.
We are working on the fix now. It is still not clear exactly how it will be fixed, but I think it will require a new version of the Remote Engine.
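In the meantime, one possible mitigation on our side is to retry transient connection failures before letting the execution plan fail. A minimal sketch of a generic retry helper, assuming it is called from a custom routine or tJava around the JDBC connect call (this is a workaround idea, not Talend's API):

```java
import java.util.function.Supplier;

public class Retry {
    // Run an action, retrying with linear backoff on runtime failure.
    // Intended for transient errors such as Oracle's
    // SQLRecoverableException "Connection reset".
    static <T> T withRetry(Supplier<T> action, int maxAttempts, long backoffMs) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                last = e;
                if (attempt < maxAttempts) {
                    try {
                        Thread.sleep(backoffMs * attempt); // back off before retrying
                    } catch (InterruptedException ie) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
        }
        throw last; // all attempts failed
    }
}
```

Usage would wrap the connect call, e.g. `Retry.withRetry(() -> { try { return DriverManager.getConnection(url, user, pass); } catch (SQLException e) { throw new RuntimeException(e); } }, 3, 1000)`, so a single reset does not fail the whole execution plan.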