
Hadoop Map-Reduce with Cloudera-VM unknown host error

I am using Talend Enterprise for Big Data (5.5.1) and running a WordCount MapReduce job. My Hadoop is installed in the Cloudera VM.
When I run the job it fails with an UnknownHostException: UnknownHost: cloudera-vm. When I check the /etc/hosts file in Ubuntu, it shows this line: cloudera-vm localhost.localdomain localdomain cloudera-vm
My NameNode is up and running on port 8020 and the JobTracker on 8021 (checked using netstat -nl | grep 8020).
Please help me resolve it.
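To make the situation concrete, here is a small sketch of what that hosts line implies (the leading IP column is my assumption, since the pasted line omits it):

```python
# Sketch of the /etc/hosts line above. The leading IP column is an
# assumption (the pasted line omits it); on the VM it is the same line
# that carries localhost.localdomain.
hosts_line = "127.0.0.1 cloudera-vm localhost.localdomain localdomain cloudera-vm"

fields = hosts_line.split()
ip, names = fields[0], fields[1:]

# "cloudera-vm" and "localhost.localdomain" sit on the same line, so
# anything that looks up the VM's hostname gets that single address.
print(ip, "cloudera-vm" in names and "localhost.localdomain" in names)
```

So the hostname and localhost resolve to one and the same address, which is why I suspect the hosts file.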

Re: Hadoop Map-Reduce with Cloudera-VM unknown host error

Hi Sanjay,
This seems to be an issue with the connection to the VM. Can you explain the job you are trying to run and the components you are using? I have done a similar job with the Hortonworks VM, where I read data from HDFS and wrote it back into HDFS using Talend. For the NameNode URI, have you mentioned the IP along with port 8020? I had used the notation "hdfs://".
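As a sketch of the URI form I mean (the IP here is a placeholder for your VM's address; 8020 is the default NameNode port):

```python
from urllib.parse import urlparse

# Placeholder NameNode URI: scheme hdfs, the VM's IP, and port 8020.
# 192.168.1.10 is an example address, not Sanjay's actual VM IP.
namenode_uri = "hdfs://192.168.1.10:8020"

parsed = urlparse(namenode_uri)
print(parsed.scheme)    # hdfs
print(parsed.hostname)  # 192.168.1.10
print(parsed.port)      # 8020
```

That is, the full IP and port go after the "hdfs://" scheme.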

Re: Hadoop Map-Reduce with Cloudera-VM unknown host error

Hi ganandg,
I have specified the NameNode URI the same way you explained, that is "hdfs://". My job runs well when I use the HDFS components, as you said, but the problem arises when I try to use the Hive component. To resolve the host name I added it to C:\Windows\System32\drivers\etc\hosts and that worked well. But my main problem is still not resolved: Hive is always listening on the loopback address.
My Hadoop conf files are configured as follows:
1) core-site.xml: hdfs://cloudera-vm:8020
2) mapred-site.xml: cloudera-vm:8021
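Spelled out as property entries, this is how I understand the values above should appear (a sketch, assuming the classic Hadoop 1.x property names):

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://cloudera-vm:8020</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>cloudera-vm:8021</value>
</property>
```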
If I use localhost instead of the hostname, Hive works; but when I use the hostname or IP address in the configuration file, it doesn't work, as it keeps listening on localhost.
Please help me if you have been through this. After a lot of searching, I came to know that the problem is somewhere in the /etc/hosts file and concerns the fully qualified domain name; I have tried a lot but keep facing the same problem.
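Based on that advice, the shape of /etc/hosts I have been trying looks like this (192.168.1.10 is a placeholder; the VM's real address from ifconfig goes there):

```
# Loopback stays on its own line; the cluster hostname gets the real IP.
127.0.0.1       localhost localhost.localdomain
192.168.1.10    cloudera-vm
```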
With Regards 
Sanjay BD