Initial settings to connect to remote Hadoop system

One Star

Initial settings to connect to remote Hadoop system

TOS 5.1.2.r90681 is installed on my local system (Windows XP), and I need to connect to a remote server (Ubuntu) where the HDFS system is located, in order to put a file from my local machine onto that server. I have used the tHDFSPut component and provided the correct host IP address as well as the port number, but for some reason I encounter this error:
Exception in component tHDFSPut_1
java.io.IOException: Call to /172.20.221.99:50070 failed on local exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at $Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at projnewtest.hdfs_job_0_1.HDFS_JOB.tFileInputDelimited_1Process(HDFS_JOB.java:425)
at projnewtest.hdfs_job_0_1.HDFS_JOB.runJobInTOS(HDFS_JOB.java:894)
at projnewtest.hdfs_job_0_1.HDFS_JOB.main(HDFS_JOB.java:762)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(Unknown Source)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:508)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
I have tried pinging the remote server and it works fine, and I am able to run commands using the tSSH component, but I am not able to use any of the HDFS components. All the Hadoop processes are up and running on the server.
Are there some initial settings that I have missed?
Regards,
Nayan.
Community Manager

Re: Initial settings to connect to remote Hadoop system

Hi
First, make sure you are able to connect to the remote HDFS system with your IP and port. Execute the following command from CMD and see if it works:
telnet ip port
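If telnet is not installed on your machine, roughly the same check can be done from Java. This is only a minimal sketch; the host is taken from the original post and the port is an assumption (the usual NameNode RPC port 8020, not the 50070 web port), so adjust both to your cluster:

// Minimal TCP connectivity check for the NameNode port (sketch, not Talend code).
import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodePortCheck {
    public static void main(String[] args) throws Exception {
        String host = "172.20.221.99"; // NameNode host from the original post
        int port = 8020;               // assumed NameNode RPC port, not the 50070 web UI port
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), 5000); // 5-second timeout
            System.out.println("NameNode port is reachable");
        }
    }
}

If this connects but the Talend job still fails, the problem is in the component settings rather than the network.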
I tested putting a file to HDFS with tHDFSPut in v5.2.0 and it works. For the detailed settings, please see my screenshots.
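In case the screenshots are not visible, what tHDFSPut does is roughly equivalent to the sketch below using the plain Hadoop FileSystem API. The URI, local path and HDFS path are placeholders, not your actual values, and this is only an illustration of the idea, not the component's internal code:

// Rough equivalent of a tHDFSPut run against a Hadoop 1.x cluster (sketch).
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPutSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode RPC URI (not the 50070 web port); adjust host/port to your cluster
        FileSystem fs = FileSystem.get(new URI("hdfs://172.20.221.99:8020"), conf);
        fs.copyFromLocalFile(new Path("C:/data/input.txt"),        // example local file
                             new Path("/user/hadoop/input.txt"));  // example HDFS target
        fs.close();
    }
}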
Shong
----------------------------------------------------------
Talend | Data Agility for Modern Business
Employee

Re: Initial settings to connect to remote Hadoop system

Hi,
The port you are using is wrong: you are using the web port. The port for a namenode is usually 8020 or 9000.
To find this port number, open the file core-site.xml in your Hadoop configuration folder and look for the key "fs.default.name". That gives you the namenode URI to put in your component.
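For example, a typical core-site.xml entry looks like the excerpt below; the host and port here are only placeholders, use whatever value your file actually contains:

<property>
  <name>fs.default.name</name>
  <value>hdfs://172.20.221.99:8020</value>
</property>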
I hope it helps,
Regards,
Rémy.