One Star

sqoop import

Hi,
I am using Talend Big Data version 5.6 and trying to import data from MySQL to Hadoop using Sqoop.
I am getting the error below:
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
I have also uploaded a screenshot with this post.
Please help me.
Thanks

8 REPLIES
Moderator

Re: sqoop import

Hi,
Thank you for your post! We can't see the screenshot on our side. Could you check it, please?
Make sure your screenshot is not bigger than 2 MB; screenshots only work if you drag and drop the image directly into the editor window.
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: sqoop import

Hi, please find the attached screenshot.
In case the screenshot is not visible, I have also copied the error below:
at project1.hg_0_1.hg.main(hg.java:999)
: org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.4-cdh5.0.3
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
: org.apache.sqoop.manager.MySQLManager - Preparing to use a MySQL streaming resultset.
: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `cities` AS t LIMIT 1
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `cities` AS t LIMIT 1
: org.apache.sqoop.orm.CompilationManager - $HADOOP_MAPRED_HOME is not set
Note: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.orm.CompilationManager - Writing jar file: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.jar
: org.apache.sqoop.manager.DirectMySQLManager - Beginning mysqldump fast path import
: org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of cities
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.sqoop.tool.ImportTool - Imported Failed: java.net.UnknownHostException: aster2
Exception in component tSqoopImport_1
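The last line before the exception says "java.net.UnknownHostException: aster2", so it looks like the machine running the Talend job cannot resolve the host name aster2 (my NameNode). A minimal check from the client machine, just as a sketch (the IP below is a placeholder, not my real NameNode address):

# Check whether the NameNode host name resolves from the machine running the job
ping aster2

# If it does not resolve, either add a hosts entry
# (Windows: C:\Windows\System32\drivers\etc\hosts, Linux/Mac: /etc/hosts), e.g.
#   192.168.0.10   aster2
# or put the NameNode's full FQDN or IP address in the tSqoopImport NameNode URI
# instead of the short name "aster2".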
Moderator

Re: sqoop import

Hi,
Would you mind uploading a screenshot of your tSqoopImport component settings to the forum?
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: sqoop import

Hi, please find the screenshot.
I have dragged and dropped it into the editor window, but I am not sure whether it was uploaded.

Regards,
Deepa
Moderator

Re: sqoop import

Hi,
It seems that your screenshot is missing. Make sure it is not bigger than 2 MB.
Best regards
Sabrina
--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.
One Star

Re: sqoop import

Hi, I'm facing a similar issue with the Sqoop import.
I'm running Step_2_SQOOP_MYSQL_TO_HDFS_Imports on TOS-BD-5.6.1.20141207_1530 on Mac OS X, with the Hadoop sandbox (HDP 2.2) and MySQL at different IP addresses.
Step_1 of the demo job completed successfully, so the MySQL side looks configured properly; I then got an error in Step 2 when connecting to the Hadoop side.
I set the Hadoop distribution for tSqoopImport to "Custom", using the Hortonworks Data Platform 2.2.X add-on downloaded from Talend Exchange.
 
The error was like this:
 connecting to socket on port 4062
connected
: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-07-09 09:30:34.238 java Unable to load realm info from SCDynamicStore
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.5.2.2.4.2-2
: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
: org.apache.sqoop.manager.SqlManager - Using default fetchSize of 1000
: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `point` AS t LIMIT 1
: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `point` AS t LIMIT 1
: org.apache.sqoop.orm.CompilationManager - $HADOOP_MAPRED_HOME is not set
Note: /tmp/sqoop-XXX/compile/XXXX/table_sample.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
: org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-XXXXXXX/compile/XXXXXXX/table_sample.jar
: org.apache.sqoop.manager.MySQLManager - It looks like you are importing from mysql.
: org.apache.sqoop.manager.MySQLManager - This transfer can be faster! Use the --direct
: org.apache.sqoop.manager.MySQLManager - option to exercise a MySQL-specific fast path.
: org.apache.sqoop.manager.MySQLManager - Setting zero DATETIME behavior to convertToNull (mysql)
: org.apache.sqoop.manager.CatalogQueryManager - The table table_sample contains a multi-column primary key. Sqoop will default to the column x01_navi_id only for this job.
: org.apache.sqoop.manager.CatalogQueryManager - The table table_sample contains a multi-column primary key. Sqoop will default to the column column_a only for this job.
: org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of point
: org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
: org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
: org.apache.hadoop.conf.Configuration.deprecation - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
: org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
: org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
: org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area file:/tmp/hadoop-XXXX/mapred/staging/hdfs2104262709/.staging/job_local2104262709_0001
Exception in component tSqoopImport_4
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
at bigdatademo.step_2_sqoop_mysql_to_hdfs_import_simple_0_1.Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.tSqoopImport_4Process(Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.java:526)
at bigdatademo.step_2_sqoop_mysql_to_hdfs_import_simple_0_1.Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple$2.run(Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.java:912)
: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://namenode_ip:8020/Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib/mysql-connector-java.jar
at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)

The error seems to show the job trying to read the JDBC driver (mysql-connector-java.jar) from HDFS with an inappropriate path
(it uses my local machine's Mac path, like /Applications/...). How can I fix this?
Please find the screenshot (sorry for the locale, I'm using the Japanese UI):
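One workaround I am considering, just as a sketch (not confirmed): since the job seems to resolve the local driver path against fs.defaultFS, uploading mysql-connector-java.jar to HDFS at the exact path from the FileNotFoundException might let the job find it. Assuming the paths from the log above:

# Create the expected directory on HDFS and upload the local driver jar there
# (paths taken from the FileNotFoundException above; adjust to your own workspace location)
hdfs dfs -mkdir -p /Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib
hdfs dfs -put /Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib/mysql-connector-java.jar /Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib/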
Best,
Daisuke
One Star

Re: sqoop import

Hi @deepatalend,
Was your issue solved? I am facing the same problem.
Could you please let me know what was done?
Thanks,
Rishit Shah
One Star

Re: sqoop import

Hi, I am using Talend Big Data version 6.1.2 and trying to import data from SQL Server to Hadoop using tSqoopImport.
I am getting an error like this:
Note: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
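As far as I understand, the two "Note:" lines are only javac deprecation warnings from the generated code, so the actual failure is probably further down in the log. For comparison, a minimal Sqoop command line for a SQL Server import looks roughly like this; all host, database, table, path and credential values below are placeholders:

# Minimal Sqoop import from SQL Server into HDFS (placeholder values throughout)
sqoop import \
  --connect "jdbc:sqlserver://sqlserver-host:1433;databaseName=mydb" \
  --driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
  --username myuser \
  --password mypassword \
  --table cities \
  --target-dir /user/talend/cities \
  -m 1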