I'm trying to read an Excel file into a PostgreSQL database. I don't get any errors on the tFileInputExcel_1 component, and I can access the database to select the table.
However, when I run the job I get:
At least job "exlTransfer" has a compile errors, please fix and export again.
Error Line: 750
Detail Message: pstmt_tPostgresqlOutput_1 cannot be resolved
There may be some other errors caused by JVM compatibility. Make sure your JVM setup is similar to the studio.
at org.talend.designer.runprocess.JobErrorsChecker.checkLastGenerationHasCompilationError(JobErrorsChecker.java:326)
at org.talend.designer.runprocess.DefaultRunProcessService.checkLastGenerationHasCompilationError(DefaultRunProcessService.java:380)
.....
When I look at the generated code, I see that the variable pstmt_tPostgresqlOutput_1 is used in the method tFileInputExcel_1Process() at line 750, but it is not declared anywhere prior to that use. In other words, the generated code does not compile.
I have tried starting a new job and defining the Excel-to-Postgres import anew, but with the same result. The same error is reported here: https://stackoverflow.com/questions/43662405/talend-csv-file-to-postgres As far as JVM compatibility is concerned, I'm using Java 8 and Talend version 6.4.1. How can I resolve this? I don't even know what type the pstmt_tPostgresqlOutput_1 variable should have.
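For reference, in jobs where the code generates correctly, the variable is a java.sql.PreparedStatement, prepared from the component's INSERT statement before the main row loop runs. The sketch below only illustrates the expected type; the connection and SQL setup are my assumptions, not taken from the actual generated job:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class PstmtTypeSketch {
    // Hypothetical sketch of the declaration the generator normally emits
    // before line 750. The names mirror the generated ones; the connection
    // and INSERT string are assumed placeholders.
    static PreparedStatement preparePstmt(Connection conn_tPostgresqlOutput_1,
                                          String insert_tPostgresqlOutput_1) throws SQLException {
        PreparedStatement pstmt_tPostgresqlOutput_1 =
                conn_tPostgresqlOutput_1.prepareStatement(insert_tPostgresqlOutput_1);
        return pstmt_tPostgresqlOutput_1;
    }

    public static void main(String[] args) {
        // The type is what matters here; no live database is needed to see it.
        System.out.println(PreparedStatement.class.getName());
    }
}
```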
public void tFileInputExcel_1Process(
    ....
    if (row1 != null) {
        /**
         * [tPostgresqlOutput_1 main ] start
         */
        currentComponent = "tPostgresqlOutput_1";

        // row1
        if (execStat) {
            runStat.updateStatOnConnection("row1" + iterateId, 1, 1);
        }

        whetherReject_tPostgresqlOutput_1 = false;
        pstmt_tPostgresqlOutput_1.addBatch(); // line 750: first use, never declared
        nb_line_tPostgresqlOutput_1++;
        batchSizeCounter_tPostgresqlOutput_1++;

        if ((batchSize_tPostgresqlOutput_1 > 0)
                && (batchSize_tPostgresqlOutput_1 <= batchSizeCounter_tPostgresqlOutput_1)) {
            try {
                int countSum_tPostgresqlOutput_1 = 0;
                for (int countEach_tPostgresqlOutput_1 : pstmt_tPostgresqlOutput_1.executeBatch()) {
                    countSum_tPostgresqlOutput_1 += (countEach_tPostgresqlOutput_1 < 0 ? 0
                            : countEach_tPostgresqlOutput_1);
                }
                insertedCount_tPostgresqlOutput_1 += countSum_tPostgresqlOutput_1;
                batchSizeCounter_tPostgresqlOutput_1 = 0;
            } catch (java.sql.BatchUpdateException e_tPostgresqlOutput_1) {
                java.sql.SQLException ne_tPostgresqlOutput_1 = e_tPostgresqlOutput_1.getNextException(),
                        sqle_tPostgresqlOutput_1 = null;
                String errormessage_tPostgresqlOutput_1;
                if (ne_tPostgresqlOutput_1 != null) {
                    // build new exception to provide the original cause
                    sqle_tPostgresqlOutput_1 = new java.sql.SQLException(
                            e_tPostgresqlOutput_1.getMessage()
                                    + "\ncaused by: " + ne_tPostgresqlOutput_1.getMessage(),
                            ne_tPostgresqlOutput_1.getSQLState(),
                            ne_tPostgresqlOutput_1.getErrorCode(),
                            ne_tPostgresqlOutput_1);
                    errormessage_tPostgresqlOutput_1 = sqle_tPostgresqlOutput_1.getMessage();
                } else {
                    errormessage_tPostgresqlOutput_1 = e_tPostgresqlOutput_1.getMessage();
                }
                int countSum_tPostgresqlOutput_1 = 0;
                for (int countEach_tPostgresqlOutput_1 : e_tPostgresqlOutput_1.getUpdateCounts()) {
                    countSum_tPostgresqlOutput_1 += (countEach_tPostgresqlOutput_1 < 0 ? 0
                            : countEach_tPostgresqlOutput_1);
                }
                insertedCount_tPostgresqlOutput_1 += countSum_tPostgresqlOutput_1;
                System.err.println(errormessage_tPostgresqlOutput_1);
            }
        }
    ....
}
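Aside from the missing declaration, the excerpt's catch block is standard JDBC batch bookkeeping: negative markers from executeBatch()/getUpdateCounts() (for example Statement.EXECUTE_FAILED, which is -3) count as zero inserted rows, and the driver's per-statement error is retrieved via getNextException(). A standalone sketch of that pattern, using only the JDK and simulated values rather than the real job:

```java
import java.sql.BatchUpdateException;
import java.sql.SQLException;

public class BatchErrorSketch {
    // Same summation the generated code performs: treat negative batch
    // markers (failed entries) as zero inserted rows.
    static int sumUpdateCounts(int[] counts) {
        int sum = 0;
        for (int each : counts) {
            sum += (each < 0 ? 0 : each);
        }
        return sum;
    }

    public static void main(String[] args) {
        // Simulate a batch failure with a chained cause, the way a JDBC
        // driver reports per-statement errors (values are made up).
        BatchUpdateException e =
                new BatchUpdateException("Batch entry 2 failed", new int[] {1, 1, -3});
        e.setNextException(
                new SQLException("duplicate key value violates unique constraint", "23505"));

        SQLException ne = e.getNextException();
        String message = (ne != null)
                ? e.getMessage() + "\ncaused by: " + ne.getMessage()
                : e.getMessage();

        System.out.println(sumUpdateCounts(e.getUpdateCounts())); // rows that did insert
        System.out.println(message.contains("caused by"));
    }
}
```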