
tSqoopImport Error while loading data from RDBMS to HDFS

Hello folks,

I created a standard Job for loading data from Oracle to HDFS in Parquet file format. I am using Talend Real-Time Big Data Platform 6.5.1 and MapR distribution 5.2.0. MapR is installed on another machine (server). While running the Job, I got the error below.

 

[INFO ]: ubpaponus.hive_job123_0_1.hive_job123 - TalendJob: 'hive_job123' - Start.
[statistics] connecting to socket on port 3816
[statistics] connected
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
[INFO ]: org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6-mapr-1601
[WARN ]: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
[WARN ]: org.apache.sqoop.ConnFactory - Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
[INFO ]: org.apache.sqoop.manager.SqlManager - Using default fetchSize of 1000
[INFO ]: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
[INFO ]: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM CP_APONUS AS t WHERE 1=0
[ERROR]: org.apache.sqoop.manager.SqlManager - Error executing statement: java.sql.SQLSyntaxErrorException: ORA-00933: SQL command not properly ended

java.sql.SQLSyntaxErrorException: ORA-00933: SQL command not properly ended

at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:450)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:399)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1017)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:655)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:249)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:566)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:215)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:58)
at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:776)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:897)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1034)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3820)
at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3867)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1502)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:758)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:767)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:270)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:606)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at ubpaponus.hive_job123_0_1.hive_job123.tSqoopImport_1Process(hive_job123.java:546)
at ubpaponus.hive_job123_0_1.hive_job123.runJobInTOS(hive_job123.java:862)
at ubpaponus.hive_job123_0_1.hive_job123.main(hive_job123.java:688)
[ERROR]: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:606)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at ubpaponus.hive_job123_0_1.hive_job123.tSqoopImport_1Process(hive_job123.java:546)
at ubpaponus.hive_job123_0_1.hive_job123.runJobInTOS(hive_job123.java:862)
at ubpaponus.hive_job123_0_1.hive_job123.main(hive_job123.java:688)
Exception in component tSqoopImport_1 (hive_job123)
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
at ubpaponus.hive_job123_0_1.hive_job123.tSqoopImport_1Process(hive_job123.java:550)
at ubpaponus.hive_job123_0_1.hive_job123.runJobInTOS(hive_job123.java:862)
at ubpaponus.hive_job123_0_1.hive_job123.main(hive_job123.java:688)
[FATAL]: ubpaponus.hive_job123_0_1.hive_job123 - tSqoopImport_1 The Sqoop import job has failed. Please check the logs.
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
at ubpaponus.hive_job123_0_1.hive_job123.tSqoopImport_1Process(hive_job123.java:550)
at ubpaponus.hive_job123_0_1.hive_job123.runJobInTOS(hive_job123.java:862)
at ubpaponus.hive_job123_0_1.hive_job123.main(hive_job123.java:688)
[statistics] disconnected

Job hive_job123 ended at 02:14 09/05/2018. [exit code=1]

 

Kindly suggest how to resolve this.

 

Regards,

Rupesh.M

1 REPLY
Moderator

Re: tSqoopImport Error while loading data from RDBMS to HDFS

Hello,

There is a Jira issue on the Talend bug tracker for this problem. It has been fixed in versions 7.0.1 and 6.5.2.

Could you please create a case on the Talend Support portal so that our colleagues from the support team can help you?
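
In the meantime, a possible workaround can be read from the log itself: because no connection manager was specified, Sqoop fell back to the generic org.apache.sqoop.manager.GenericJdbcManager, which generated "SELECT t.* FROM CP_APONUS AS t WHERE 1=0". Oracle does not accept the AS keyword for table aliases, hence ORA-00933. A hedged sketch (not verified on your setup) is to force the Oracle-specific manager through the additional/command-line arguments of tSqoopImport alongside your existing --driver setting:

--connection-manager org.apache.sqoop.manager.OracleManager

OracleManager generates Oracle-compatible SQL (no AS in the table alias), so the code-generation query should succeed. Please treat this as a suggestion to try, not a confirmed fix for the tracked bug.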

Best regards

Sabrina
