Employee

Launching the HIVE>Step_1_Hive_Load_Tables

Downloaded Hortonworks Sandbox 2.0 and Talend Open Studio for Big Data 5.4.
Installed everything with the default settings.
Imported the "BigData Demo" in TOS BD.
Tried to launch HIVE > Step_1_Hive_Load_Tables.
It failed with the following error message:
Exception in component tHDFSPut_1
java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation
at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:213)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2401)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2411)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:156)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:153)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:153)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.tHDFSPut_1Process(Step_1_Hive_Load_Tables.java:622)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.runJobInTOS(Step_1_Hive_Load_Tables.java:2642)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.main(Step_1_Hive_Load_Tables.java:2456)
disconnected
What went wrong?

  • Big Data
10 REPLIES
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

Hi,
Did you change the distribution within the drop-down list? Is your HDP 2.0 the GA or the Beta version?
Cheers,
Rémy.
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

I have switched to "Hortonworks Data Platform V2.0.0 (BigWheel)" in tHDFSPut_1.
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

I downloaded HDP 2.0 today, so it is the GA.
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

Hi,
It seems you have a classpath conflict. The distribution must be the same for all the components in the designer. If your tHDFSPut uses HDP 2.0 (BigWheel) while the tHive components use HDP 1.0, you will run into exactly this kind of problem.
So please make sure to select the correct distribution for all the components.
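If all components are already set to the same distribution and the "Not implemented by the DistributedFileSystem" error persists, one way to see which Hadoop jar actually wins on the job's classpath is a small diagnostic like the one below. This is a sketch, not part of the demo job; `JarLocator` is a hypothetical helper name, and you would run it with the generated job's classpath:

```java
// Sketch: print where a class was loaded from, to spot a
// Hadoop 1.x / 2.x jar mix on the classpath.
public class JarLocator {
    static String locationOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap-loaded classes (e.g. java.lang.String) have no CodeSource.
            return src == null ? "(bootstrap classloader)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        // With the job's classpath, this should point at exactly one hadoop-*.jar;
        // two different Hadoop versions on the classpath is the conflict to look for.
        System.out.println(locationOf("org.apache.hadoop.fs.FileSystem"));
        System.out.println(locationOf("java.lang.String"));
    }
}
```

Run standalone (without any Hadoop jars) it prints `(not on classpath)` for the first lookup; with the job's classpath it prints the winning jar's location.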
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

I have also switched to "Hortonworks Data Platform V2.0.0 (BigWheel)" in tHiveConnection_1.
Now TOS reports:
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1546)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:519)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:156)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:153)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:153)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.tHDFSPut_1Process(Step_1_Hive_Load_Tables.java:622)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.runJobInTOS(Step_1_Hive_Load_Tables.java:2640)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.main(Step_1_Hive_Load_Tables.java:2454)
Exception in component tHDFSPut_1
java.net.ConnectException: Call From snlipp-PC/192.168.56.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
at org.apache.hadoop.ipc.Client.call(Client.java:1351)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at $Proxy9.mkdirs(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at $Proxy9.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:467)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2394)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2365)
at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:817)
at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:813)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:813)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:806)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1933)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.tHDFSPut_1Process(Step_1_Hive_Load_Tables.java:631)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.runJobInTOS(Step_1_Hive_Load_Tables.java:2640)
at bigdatademo.step_1_hive_load_tables_0_1.Step_1_Hive_Load_Tables.main(Step_1_Hive_Load_Tables.java:2454)
Caused by: java.net.ConnectException: Connection refused: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:547)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:642)
at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
at org.apache.hadoop.ipc.Client.call(Client.java:1318)
... 22 more
disconnected
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

My environment is TOS BD 5.4.0 on Windows 7 SP1 64-bit, with the Hortonworks Sandbox 2.0 running in Oracle VirtualBox on the same Windows 7 machine (8 GB RAM).
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

The first issue is not blocking. It's a Hortonworks bug; they are aware of it and will fix it in an upcoming release:
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
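For anyone who wants to silence the winutils warning locally in the meantime: Hadoop's `Shell` class resolves `winutils.exe` from the `hadoop.home.dir` system property (or the `HADOOP_HOME` environment variable). A minimal sketch of that workaround, assuming you have placed a `winutils.exe` under `C:\hadoop\bin` yourself (the path is an assumption, adjust to your machine):

```java
// Sketch of a local workaround for the winutils warning on Windows.
// Assumed layout: C:\hadoop\bin\winutils.exe (not created by this code).
// The property must be set before any Hadoop class is loaded, otherwise
// Shell's static initializer has already probed the null path.
public class WinutilsWorkaround {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\hadoop");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Since the warning is non-blocking, this is purely cosmetic; waiting for the fix is equally fine.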
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

The second issue is a connection issue. What did you enter in the tHDFSPut component as the NameNode URI?
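The stack trace shows the job calling `localhost:9000`, whereas the Hortonworks Sandbox's NameNode normally listens on port 8020 (e.g. `hdfs://sandbox.hortonworks.com:8020`, or the VM's IP; both the host name and port here are assumptions, check your VM's configuration). A quick way to verify the URI's host/port is reachable from the Windows host before re-running the job is a plain TCP probe:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean isOpen(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Self-contained demo: open a local listener so one probe succeeds.
        try (ServerSocket srv = new ServerSocket(0)) {
            System.out.println(isOpen("127.0.0.1", srv.getLocalPort(), 500));
        }
        System.out.println(isOpen("127.0.0.1", 1, 500)); // port 1 is virtually never open
        // Real usage against the sandbox would look like:
        //   isOpen("sandbox.hortonworks.com", 8020, 2000)
    }
}
```

If the probe against the sandbox fails, the problem is networking (VirtualBox port forwarding or the VM's IP) rather than the Talend job itself.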
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

Please read my message here: http://www.talendforge.org/forum/viewtopic.php?pid=120548#p120548
It explains an HDP 2.0 bug.
Employee

Re: Launching the HIVE>Step_1_Hive_Load_Tables

I see. There seems to be a misconfiguration in HDP 2.0, so I'll wait until the issue is fixed in HDP.