Error with Talend 6.0.1 and HDP 2.2



Hi,
I am connecting to HDP 2.2 using Talend and trying to design a job, but it is giving me the following error:
Starting job mrjobexample at 13:07 08/01/2016.
connecting to socket on port 3432
connected
: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:271)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:248)
at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:325)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:319)
at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:567)
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:420)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:316)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:159)
at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:156)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Unknown Source)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:156)
at bigdata_demo.mrjobexample_0_1.mrjobexample.tHDFSInput_1Process(mrjobexample.java:1515)
at bigdata_demo.mrjobexample_0_1.mrjobexample.tHDFSConnection_1Process(mrjobexample.java:395)
at bigdata_demo.mrjobexample_0_1.mrjobexample.runJobInTOS(mrjobexample.java:2220)
at bigdata_demo.mrjobexample_0_1.mrjobexample.main(mrjobexample.java:2077)
Exception in component tHDFSInput_1
java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Unknown Source)
at sun.nio.ch.SocketChannelImpl.connect(Unknown Source)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at org.apache.hadoop.hdfs.DFSClient.newConnectedPeer(DFSClient.java:3101)
at org.apache.hadoop.hdfs.BlockReaderFactory.nextTcpPeer(BlockReaderFactory.java:755)
at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:670)
at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:337)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:576)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:800)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:847)
at java.io.DataInputStream.read(Unknown Source)
at java.io.FilterInputStream.read(Unknown Source)
at java.io.PushbackInputStream.read(Unknown Source)
at org.talend.fileprocess.UnicodeReader.<init>(UnicodeReader.java:25)
at org.talend.fileprocess.TOSDelimitedReader.<init>(TOSDelimitedReader.java:77)
at org.talend.fileprocess.FileInputDelimited.<init>(FileInputDelimited.java:93)
at bigdata_demo.mrjobexample_0_1.mrjobexample.tHDFSInput_1Process(mrjobexample.java:1566)
at bigdata_demo.mrjobexample_0_1.mrjobexample.tHDFSConnection_1Process(mrjobexample.java:395)
at bigdata_demo.mrjobexample_0_1.mrjobexample.runJobInTOS(mrjobexample.java:2220)
at bigdata_demo.mrjobexample_0_1.mrjobexample.main(mrjobexample.java:2077)
disconnected
Job mrjobexample ended at 13:07 08/01/2016.
What I am doing in the job is:
tHDFSInput --> tFilterRow --> tAggregateRow --> tHDFSOutput
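For clarity, this is roughly the equivalent logic written directly against the HDFS API. It is only an illustrative sketch: the fs.defaultFS URI, the file paths, and the filter/aggregate rules below are placeholders I made up, not the actual code Talend generates for the job.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFilterAggregateSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The NameNode URI must be resolvable from the client machine;
        // an unresolvable hostname is what later surfaces as
        // java.nio.channels.UnresolvedAddressException.
        conf.set("fs.defaultFS", "hdfs://sandbox.hortonworks.com:8020"); // assumed host/port

        try (FileSystem fs = FileSystem.get(conf)) {
            // tHDFSInput + tFilterRow + tAggregateRow: read, drop bad rows, count per key
            Map<String, Long> counts = new HashMap<>();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(fs.open(new Path("/user/talend/input.csv"))))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.split(";");
                    if (fields.length < 2 || fields[1].trim().isEmpty()) {
                        continue; // tFilterRow equivalent: reject rows missing the key column (assumed rule)
                    }
                    counts.merge(fields[1], 1L, Long::sum); // tAggregateRow equivalent: count per key (assumed rule)
                }
            }

            // tHDFSOutput equivalent: write the aggregated rows back to HDFS
            try (BufferedWriter out = new BufferedWriter(
                    new OutputStreamWriter(fs.create(new Path("/user/talend/output.csv"), true)))) {
                for (Map.Entry<String, Long> e : counts.entrySet()) {
                    out.write(e.getKey() + ";" + e.getValue());
                    out.newLine();
                }
            }
        }
    }
}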
What I have observed is that Talend 6.0.1 with HDP 2.2/2.1/2.3 does not work and gives the same error in every case.
Help appreciated.
Thanks,
Saurabh.
Moderator

Re: Error with Talend 6.0.1 and HDP 2.2

Hi,
Are you using Windows OS? Here is a related JIRA issue: https://jira.talendforge.org/browse/TBD-460
Have you already checked the KB article TalendHelpCenter: The missing winutils.exe program in the Big Data Jobs?
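For reference, the usual workaround on Windows is to make hadoop.home.dir point to a folder that contains bin\winutils.exe before any Hadoop class is loaded. A minimal sketch, assuming winutils.exe has been placed under C:\hadoop\bin (the path is just a placeholder):
public class WinutilsWorkaroundSketch {
    public static void main(String[] args) {
        // Must be set before the first Hadoop class that initializes
        // org.apache.hadoop.util.Shell, otherwise the
        // "Could not locate executable null\bin\winutils.exe" message still appears.
        System.setProperty("hadoop.home.dir", "C:\\hadoop"); // expects C:\hadoop\bin\winutils.exe to exist
        System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
        // ...only after this point should the job touch FileSystem / DFSClient...
    }
}
Without changing any code, you can also pass -Dhadoop.home.dir=C:\hadoop as a JVM argument in the Run view's Advanced settings, or set the HADOOP_HOME environment variable to that folder.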
Best regards
Sabrina