Problem with tHdfsConnection


Hello everyone,
I am trying to use Talend Big Data but cannot get it to work properly.
I get the following error when launching the job shown in the attachment (job_launched.JPG):
 connecting to socket on port 3673
connected
DAL_Extrateur_Segment_Prime_Individuelle_v2 - Test du début d'un job
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory - The short-circuit local reads feature cannot be used because UNIX Domain sockets are not available on Windows.
: gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2 - tHDFSPut_1 Call From GM64XXX/XXX.XXX.XX.X to myhbaseserv:50070 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
Exception in component tHDFSPut_1
java.net.ConnectException: Call From GM64XXX/XXX.XXX.XX.X to myhbaseserv99:50070 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
 at java.lang.reflect.Constructor.newInstance(Unknown Source)
 at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
 at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
 at org.apache.hadoop.ipc.Client.call(Client.java:1431)
 at org.apache.hadoop.ipc.Client.call(Client.java:1358)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
 at com.sun.proxy.$Proxy8.mkdirs(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
 at java.lang.reflect.Method.invoke(Unknown Source)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
 at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
 at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008)
 at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978)
 at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047)
 at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043)
 at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
 at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043)
 at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
 at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1877)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tFileList_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:671)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tOracleInput_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3180)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.tRunJob_1Process(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3456)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.runJobInTOS(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3705)
 at gouvernance_bd.dal_extrateur_segment_prime_individuelle_v2_0_1.DAL_Extrateur_Segment_Prime_Individuelle_v2.main(DAL_Extrateur_Segment_Prime_Individuelle_v2.java:3539)
Caused by: java.net.ConnectException: Connection refused: no further information
 at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
 at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
 at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
 at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
 at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:612)
 at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:710)
 at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:373)
 at org.apache.hadoop.ipc.Client.getConnection(Client.java:1493)
 at org.apache.hadoop.ipc.Client.call(Client.java:1397)
 ... 24 more
disconnected
Job DAL_Extrateur_Segment_Prime_Individuelle_v2 ended at 08:48 01/06/2016.

In my tHDFSConnection, I have:
Distribution: Hortonworks 2.3.0
NameNode URI: hdfs://myhbaseserv:50070 (this is the port from the dfs.namenode.http-address property in my hdfs-site.xml)
User name: hdfs (which is the value of the dfs.cluster.administrators property in my hdfs-site.xml)
Is that correct?
In many samples I see port 8020 used for the NameNode URI instead; I tried to check this myself with the small test below.
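To double-check outside of Talend, I put together this minimal sketch (assuming the Hadoop client jars are on the classpath; myhbaseserv is my own host, and 8020 is only the usual Hortonworks default for the NameNode RPC port defined by fs.defaultFS / dfs.namenode.rpc-address, so it may differ on your cluster). My understanding is that 50070 serves the NameNode web UI (dfs.namenode.http-address), while the hdfs:// URI in tHDFSConnection should point at the RPC port:

// Minimal HDFS connectivity check, independent of the Talend job.
// Host and port are assumptions; adjust to your fs.defaultFS value.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use the NameNode RPC port (commonly 8020), not the HTTP port 50070.
        conf.set("fs.defaultFS", "hdfs://myhbaseserv:8020");
        FileSystem fs = FileSystem.get(conf);
        // Listing the root directory fails fast with a ConnectException
        // if the NameNode is not reachable on that port.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}

If this listing works with the RPC port but the job still fails, I suppose the issue is in the component configuration rather than the cluster itself.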
Thanks a lot for your help!



Re: Problem with tHdfsConnection

Neither 8020 nor any other port can be reached by my component; see the small probe I ran below.
If you have any advice... Thanks
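In case it helps with the diagnosis, this is the plain TCP probe I run from the machine where the Studio job executes, just to see whether the NameNode ports answer at all (the host name and port numbers are my own values, so substitute yours):

// Plain socket reachability check, no Hadoop libraries needed.
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    public static void main(String[] args) {
        String host = "myhbaseserv";
        int[] ports = {8020, 50070};
        for (int port : ports) {
            try (Socket socket = new Socket()) {
                // 5-second timeout so an unreachable port fails quickly.
                socket.connect(new InetSocketAddress(host, port), 5000);
                System.out.println(host + ":" + port + " is reachable");
            } catch (Exception e) {
                System.out.println(host + ":" + port + " refused/unreachable: " + e.getMessage());
            }
        }
    }
}

In my case both ports come back as refused, which makes me think it is a firewall or network issue rather than the tHDFSConnection settings.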
