One Star

tHDFSOutput error with HortonWorks Data Platform v1

I am using Talend Studio:
Version: 5.4.0 RC1
Build id: r108625-20130926-0447
I get the following error:
Starting job HortonWorks_first_MapReduce at 23:17 23/10/2013.

connecting to socket on port 3666
connected
Oct 23, 2013 11:18:56 PM org.apache.hadoop.util.NativeCodeLoader
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream createBlockOutputStream
INFO: Exception in createBlockOutputStream 10.0.2.15:50010 java.net.ConnectException: Connection timed out: no further information
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream nextBlockOutputStream
INFO: Abandoning block blk_4754674227949757591_1765
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream nextBlockOutputStream
INFO: Excluding datanode 10.0.2.15:50010
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer run
WARNING: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hue/subhaji1.txt could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1983)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:785)
at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1444)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1440)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1438)
at org.apache.hadoop.ipc.Client.call(Client.java:1066)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream processDatanodeError
WARNING: Error Recovery for block blk_4754674227949757591_1765 bad datanode nodes == null
Oct 23, 2013 11:19:17 PM org.apache.hadoop.hdfs.DFSClient$DFSOutputStream processDatanodeError
WARNING: Could not get block locations. Source file "/user/hue/subhaji1.txt" - Aborting...
Exception in component tHDFSOutput_1
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hue/subhaji1.txt could only be replicated to 0 nodes, instead of 1
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1983)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:785)
at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:587)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1444)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1440)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1438)
at org.apache.hadoop.ipc.Client.call(Client.java:1066)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3507)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3370)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2700(DFSClient.java:2586)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2826)
disconnected
Job HortonWorks_first_MapReduce ended at 23:19 23/10/2013.

What am I doing wrong?
I am using this doc as a reference for configuring the Talend pipeline/map:
http://hortonworks.com/kb/how-to-connectwrite-a-file-to-hortonworks-sandbox-from-talend-studio/

Moderator

Re: tHDFSOutput error with HortonWorks Data Platform v1

Hi,
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer run
WARNING: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hue/subhaji1.txt could only be replicated to 0 nodes, instead of 1

This error usually means the NameNode could not find a live DataNode to accept the block, often because disk space is insufficient. Could you please check the following (for example with the commands shown below):
1. Whether there is free space on the system and in HDFS.
2. Whether the DataNode is running.
3. Whether HDFS is in safe mode.
4. The read and write permissions on the target directory.
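A minimal sketch of those checks from the sandbox shell, assuming the standard Hadoop 1.x client commands are on the path:

# 1. Free space on the local disks and in HDFS
df -h
hadoop dfsadmin -report        # configured/remaining capacity and live DataNodes

# 2. Is the DataNode process running?
jps | grep -i datanode

# 3. Is HDFS in safe mode?
hadoop dfsadmin -safemode get  # if ON: hadoop dfsadmin -safemode leave

# 4. Read/write permissions on the target directory
hadoop fs -ls /user/hue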
Best regards
Sabrina
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

Sabrina,
The file I am trying to move is only 10 KB, and I can manually upload it using Hue.
See the attached screenshots showing the health of the Hadoop VM and the file uploaded manually.
Please review this thread from Hortonworks; lots of other folks have the same issue:
http://hortonworks.com/community/forums/topic/issue-connecting-talend-studio-with-sandbox/
Let me know what I should do next.
Thanks
Subhajit
Moderator

Re: tHDFSOutput error with HortonWorks Data Platform v1

Hi,
You have problems with HDFS itself. It is best to consult an HDFS expert/administrator to find a workaround for your issue.
Best regards
Sabrina
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

I just installed Hortonworks Sandbox 2.0, and now Talend will not communicate at all. At least with Sandbox 1.0 it would show the file structure; now it doesn't show anything at all.
Employee

Re: tHDFSOutput error with HortonWorks Data Platform v1

Hi,
Can you please browse your NameNode web UI on port 50070 (http://hostname:50070) and let us know whether there is a DataNode that is alive?
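A quick command-line sketch of the same check, assuming the hadoop client is installed on the sandbox and hostname resolves to your sandbox:

# live/dead DataNode counts as reported by the NameNode
hadoop dfsadmin -report | grep -i "datanodes available"
# or pull the NameNode status page directly
curl -s http://hostname:50070/dfshealth.jsp | grep -i "live nodes"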
Cheers,
Rémy.
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

Attached are the screenshots of a working environment.
Employee

Re: tHDFSOutput error with HortonWorks Data Platform v1

I'm not sure this is the cause of your issue, but in any case I see that you are using Talend 5.4.0 RC1 with HDP 2.0 GA, and they are not compatible with each other.
Our 5.4.0 RC1 is compatible with HDP 2.0 Beta, and our 5.4.0 GA is compatible with HDP 2.0 GA. Please upgrade your Talend Studio and try again. If the same problem occurs, please post here again.
Cheers,
Rémy.
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

Remy,
Is 5.4.0 GA available for download?
Subhajit
Employee

Re: tHDFSOutput error with HortonWorks Data Platform v1

Yes, on our website.
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

Will the first error also be resolved with the latest 5.4 release?
Employee

Re: tHDFSOutput error with HortonWorks Data Platform v1

What's the first error?
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

I get this error when I am using the tHDFSGet component with Talend 5.4 GA and Sandbox 2.0:
Starting job FileCopy at 12:55 31/10/2013.

connecting to socket on port 3710
connected
: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.&lt;clinit&gt;(Shell.java:293)
at org.apache.hadoop.util.StringUtils.&lt;clinit&gt;(StringUtils.java:76)
at org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1546)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:519)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:156)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:153)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:153)
at hadoop.filecopy_0_1.FileCopy.tHDFSGet_1Process(FileCopy.java:292)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:721)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:586)
disconnected
Job FileCopy ended at 12:55 31/10/2013.
One Star

Re: tHDFSOutput error with HortonWorks Data Platform v1

If I use the tHDFSCopy component, I get the following error:
===
Starting job FileCopy at 14:50 31/10/2013.

connecting to socket on port 3482
connected
: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.&lt;clinit&gt;(Shell.java:293)
at org.apache.hadoop.util.StringUtils.&lt;clinit&gt;(StringUtils.java:76)
at org.apache.hadoop.conf.Configuration.getTrimmedStrings(Configuration.java:1546)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:519)
at org.apache.hadoop.hdfs.DFSClient.&lt;init&gt;(DFSClient.java:453)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:136)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2433)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:156)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:153)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:153)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:301)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
: org.apache.hadoop.hdfs.DFSClient - Failed to connect to /10.0.2.15:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
java.net.ConnectException: Connection timed out: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.hdfs.DFSInputStream.newTcpPeer(DFSInputStream.java:955)
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1107)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:533)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
: org.apache.hadoop.hdfs.DFSClient - Could not obtain BP-1578958328-10.0.2.15-1382306880516:blk_1073742528_1707 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
: org.apache.hadoop.hdfs.DFSClient - DFS chooseDataNode: got # 1 IOException, will wait for 2008.527916396605 msec.
: org.apache.hadoop.hdfs.DFSClient - Failed to connect to /10.0.2.15:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
java.net.ConnectException: Connection timed out: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.hdfs.DFSInputStream.newTcpPeer(DFSInputStream.java:955)
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1107)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:533)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
: org.apache.hadoop.hdfs.DFSClient - Could not obtain BP-1578958328-10.0.2.15-1382306880516:blk_1073742528_1707 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
: org.apache.hadoop.hdfs.DFSClient - DFS chooseDataNode: got # 2 IOException, will wait for 6936.148655302311 msec.
: org.apache.hadoop.hdfs.DFSClient - Failed to connect to /10.0.2.15:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
java.net.ConnectException: Connection timed out: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.hdfs.DFSInputStream.newTcpPeer(DFSInputStream.java:955)
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1107)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:533)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
: org.apache.hadoop.hdfs.DFSClient - Could not obtain BP-1578958328-10.0.2.15-1382306880516:blk_1073742528_1707 from any node: java.io.IOException: No live nodes contain current block. Will get new block locations from namenode and retry...
: org.apache.hadoop.hdfs.DFSClient - DFS chooseDataNode: got # 3 IOException, will wait for 7523.5380497349015 msec.
: org.apache.hadoop.hdfs.DFSClient - Failed to connect to /10.0.2.15:50010 for block, add to deadNodes and continue. java.net.ConnectException: Connection timed out: no further information
java.net.ConnectException: Connection timed out: no further information
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.hdfs.DFSInputStream.newTcpPeer(DFSInputStream.java:955)
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:1107)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:533)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
: org.apache.hadoop.hdfs.DFSClient - DFS Read
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1578958328-10.0.2.15-1382306880516:blk_1073742528_1707 file=/user/hue/jobsub/sample_data/sonnets.txt
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:838)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:526)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
Exception in component tHDFSCopy_1
org.apache.hadoop.hdfs.BlockMissingException: Could not obtain block: BP-1578958328-10.0.2.15-1382306880516:blk_1073742528_1707 file=/user/hue/jobsub/sample_data/sonnets.txt
at org.apache.hadoop.hdfs.DFSInputStream.chooseDataNode(DFSInputStream.java:838)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:526)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:749)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:793)
at java.io.DataInputStream.read(Unknown Source)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:78)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:366)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at hadoop.filecopy_0_1.FileCopy.tHDFSCopy_1Process(FileCopy.java:316)
at hadoop.filecopy_0_1.FileCopy.runJobInTOS(FileCopy.java:543)
at hadoop.filecopy_0_1.FileCopy.main(FileCopy.java:408)
disconnected
Job FileCopy ended at 14:52 31/10/2013.
===
Employee

Re: tHDFSOutput error with HortonWorks Data Platform v1

Hi,
The first error (Could not locate executable null\bin\winutils.exe in the Hadoop binaries) is an HDP 2.0 issue when the job is run from a Windows client.
The second one is a cluster problem. It seems that your DataNodes are not available or not reachable by the client.
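For the winutils error, a common workaround (a sketch on my side, not an official fix) is to give the Windows client a local Hadoop home containing a winutils.exe built for Hadoop 2.x, and point the environment at it before launching the Studio:

REM on the Windows machine running Talend Studio; C:\hadoop is my example location
REM place winutils.exe under C:\hadoop\bin, then:
set HADOOP_HOME=C:\hadoop

For the DataNode timeouts, note that the client is being told to contact the DataNode at the VM's internal NAT address (10.0.2.15:50010), which is not reachable from your host. Assuming a VirtualBox NAT setup, the usual sandbox workaround is to forward port 50010 to the guest, map the sandbox hostname in your hosts file, and make the client connect by hostname rather than by the advertised IP, by adding this Hadoop property (it exists in Hadoop 2.x) to your Talend HDFS connection:

dfs.client.use.datanode.hostname=true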