Seven Stars

Apache Hadoop 2.7.5 distribution - Client version mismatch

Hi

I'm running an Apache Hadoop 2.7.5 distribution remotely.

I can't connect; the error is: Server IPC version 9 cannot communicate with client version 4
My Hadoop is running a newer version than the Talend component expects.


1 - Is it possible to fix this by replacing jar files? If yes, which jar files, and where?
2 - Or can it be fixed by setting custom properties in the tHDFSConnection component? And what about the jar files in that case?

3 - Or is there a config change to make on the remote Hadoop cluster (core-site.xml or similar)?
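For what it's worth, "Server IPC version 9" corresponds to the Hadoop 2.x wire protocol, while "client version 4" is the Hadoop 1.x line, so the component is loading 1.x-era client jars. A quick way to see which versions are in play is to read them off the jar file names. A minimal sketch (the temp directory and jar names are stand-ins for the Studio's component lib folder):

```shell
# Stand-in for the Talend component lib directory (hypothetical path);
# on a real install, point LIB at the Studio's bundled Hadoop jars.
LIB=$(mktemp -d)
touch "$LIB/hadoop-core-1.0.0.jar" "$LIB/hadoop-common-2.7.5.jar"

# Extract the version suffix from each hadoop-*.jar file name;
# 1.x jars speak IPC version 4, 2.x jars speak IPC version 9.
for jar in "$LIB"/hadoop-*.jar; do
  basename "$jar" | sed -E 's/^hadoop-[a-z]+-([0-9][0-9.]*)\.jar$/\1/'
done
```

If the client-side versions printed here don't match the server's Hadoop line, replacing jars (or picking a matching distribution in the component) is the direction to look.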

 

Regards

Jesper 


Windows 64bit.

TOS BigData : TOS_BD-20180116_1512-V6.5.1

Java : jdk1.8.0_161 + jre1.8.151


1 ACCEPTED SOLUTION

Seven Stars

Re: Apache Hadoop 2.7.5 distribution - Client version mismatch

Hi


Thanks, I had already found this documentation. But I gave it another try; this time, instead of using Apache as the starting point, I used Hortonworks.

 - Windows desktop with Talend Big Data.
 - Ubuntu server with Hadoop 2.7.5: basic Apache installation, pseudo-distributed cluster setup, WebHDFS enabled.


tHDFSConnection: Custom configuration, based on Hortonworks 2.6.0

From my server I downloaded all the jar files in /usr/local/bin/hadoop/share/hadoop/tools (Linux to Windows 64-bit).
I tried the Hortonworks 2.6.0 setup and replaced three jar files (hadoop-*.jar); the other jar versions are similar to the ones in the Apache Hadoop distro's /tools/lib.

"hdfs://192.168.1.123:9000/"

 

Tried to connect:

- It complained that it couldn't locate winutils.exe. I downloaded winutils.exe from Hortonworks and placed it at C:\hadoop\bin\winutils.exe.

[ERROR]: org.apache.hadoop.util.Shell - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Hadoop documentation on winutils: Hadoop documentation winutils.exe

 

- On Windows, I set the environment variable HADOOP_HOME = C:\hadoop

- Connected... yes, success!
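As an aside, the "null\bin\winutils.exe" in the stack trace already tells the story: Hadoop builds that path from the hadoop.home.dir property / HADOOP_HOME variable, and when both are unset the home directory becomes the literal string "null". A simplified sketch of that lookup (forward slashes instead of Windows backslashes):

```shell
# Simplified re-creation of the path Hadoop's Shell class builds:
# when neither the hadoop.home.dir JVM property nor HADOOP_HOME is
# set, the home directory is the literal string "null".
HADOOP_HOME=""                       # simulate the unset variable
HOME_DIR="${HADOOP_HOME:-null}"
echo "${HOME_DIR}/bin/winutils.exe"  # matches the path in the error
```

So seeing "null" in the error means the lookup never got past the missing HADOOP_HOME, exactly what setting the variable fixed.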


The annoying part is that the Hortonworks distro works, but it's not what we prefer; there should be a proper configuration for Apache distros instead of this 1.0.0 one.
Despite the lack of an Apache configuration, it works, and that's fine by me...

@xdshi : A couple of questions:
 - Why is winutils.exe needed? It just mimics Unix functionality on a Windows OS, assuming Hadoop is installed on Windows... but I'm connecting to a remote Linux server. Is it a built-in SSH client or something?
 - Is there an alternative way to connect to Linux using only Java on a Windows Talend client? (I've tried tRESTClient but can't PUT (upload) files; posted: tRESTClient PUT upload file)
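On the tRESTClient question: a WebHDFS upload is a two-step operation, which may be why a single PUT fails. The first PUT carries no file data and returns a 307 redirect to a DataNode; the file body goes in a second PUT to that Location. A hedged curl sketch (host, port, path, and user are examples; 50070 is the usual Hadoop 2.x NameNode HTTP port):

```shell
# Hypothetical cluster coordinates; adjust to your setup.
NAMENODE="192.168.1.123:50070"
DEST="/user/jesper/data.csv"

# Step 1: PUT with no body; the NameNode replies 307 with a Location
# header pointing at a DataNode (no file data is sent yet).
STEP1="curl -i -X PUT \"http://${NAMENODE}/webhdfs/v1${DEST}?op=CREATE&user.name=jesper\""
echo "$STEP1"

# Step 2: PUT the actual file body to the Location URL from step 1:
#   curl -i -X PUT -T data.csv "<Location-from-step-1>"
```

A REST client that sends the file body on the first PUT, or that doesn't re-issue the PUT against the redirect target, will not complete the upload.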


Help is appreciated :-)

4 REPLIES
Moderator

Re: Apache Hadoop 2.7.5 distribution - Client version mismatch

Hello,

In Talend Studio, if there is no support for the Hadoop distribution you want to use, this support may be available via an update.

TalendHelpCenter:Updating support for the most recent Hadoop distributions

Are you trying to connect to a custom Hadoop distribution?

Here is the online documentation about it:

TalendHelpCenter:Connecting to a custom Hadoop distribution

Best regards

Sabrina

--
Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.

Moderator

Re: Apache Hadoop 2.7.5 distribution - Client version mismatch

Hello,

This issue is referenced at the following link : TalendHelpCenter:The missing winutils.exe program in the Big Data Jobs

Best regards

Sabrina

Seven Stars

Re: Apache Hadoop 2.7.5 distribution - Client version mismatch

@xdshi

 

I found this... but it doesn't make sense to me. This isn't a solution.

In my case: 

- I develop Talend ETL jobs on Windows, export them, and schedule them on Linux (prod)... I use context variables to cover promotion.

 

The Talend HDFS component should not decide on its own whether it is talking to Hadoop on Windows based on the Talend client's OS... it should be a component checkbox ("use winutils and override HADOOP_HOME", or something similar).
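Until such a checkbox exists, one per-job workaround may help: Hadoop's Shell class reads the JVM system property hadoop.home.dir before falling back to the HADOOP_HOME environment variable, so the override can travel with the job launcher instead of the OS. A sketch (the jar name and path are hypothetical):

```shell
# Hypothetical exported-job jar and winutils location; Hadoop checks
# the -Dhadoop.home.dir property before the HADOOP_HOME environment
# variable, so this overrides per launch without touching the OS.
JOB_JAR="my_talend_job.jar"
CMD="java -Dhadoop.home.dir=C:/hadoop -jar ${JOB_JAR}"
echo "$CMD"
```

On the Linux production side no winutils is needed, so the property can simply be omitted there.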

So it's a design/configuration issue. Is there a way to address this? It's not a bug, but it's much bigger!