Hi, I have a job that writes some data (using tHDFSOutput) to HDFS on a remote server. The job runs correctly from the standard "Run" tab, with no connection problems. When I run the job through Oozie, I receive the error:
The local file can not upload to Hadoop HDFS!
Unable to upload file C:\...\TOS_BD-20161026_1219-V6.3.0\temp\Twitter\lib\hadoop-hdfs-188.8.131.52.4.0.0-169.jar to path/to/hdfs

What I see in the HDFS directory where the job is deployed is a bunch of 0-byte jar files... Any idea what the reason could be? Locally I use JRE 1.8, while the server OS runs JRE 1.7; could this be the problem? (More info: I can successfully put this same jar file to HDFS via Hue.) Thank you!
Re: [resolved] Oozie cannot upload jar files to HDFS.
Hi Sabrina, thank you very much for the answer. After some pain I resolved the Java version issue, and JDK 1.8 is now installed on all the nodes. Unfortunately, I still get the same error when I run the job from the Oozie view. Any idea? One thing I should have done earlier is click the "Check services" button in the edit cluster metadata dialog: the NameNode check is OK, but the Resource Manager check fails. Hostnames and ports are consistent with the cluster configuration. Does anyone have an idea of what could be causing this error? I have attached the error log as errorlog.txt. It seems to be this error: , but Kerberos security is not enabled in the cluster and I have not ticked the checkbox mentioned in the link. Please help! Thanks!
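Since the NameNode check passes but the Resource Manager check fails, a quick sanity check is whether the Resource Manager host and port are even reachable from the Studio machine. The hostname below is a placeholder, and 8032 is only the stock default for `yarn.resourcemanager.address`; your cluster may use a different port, so read it from your cluster configuration.

```shell
# Hypothetical values -- use the Resource Manager host/port from your
# cluster metadata. 8032 is the yarn.resourcemanager.address default.
RM_HOST="resourcemanager.example.com"
RM_PORT=8032

# Confirm the hostname resolves from this machine (same as the working
# NameNode check would need).
getent hosts "$RM_HOST"

# Quick TCP reachability probe using bash's /dev/tcp (no extra tools).
if (exec 3<>"/dev/tcp/$RM_HOST/$RM_PORT") 2>/dev/null; then
  echo "Resource Manager port is reachable"
else
  echo "cannot reach $RM_HOST:$RM_PORT -- check firewall or the configured port"
fi
```

If the port is unreachable while the NameNode port works, that points at a firewall rule or a wrong port in the metadata rather than at Kerberos, which would be consistent with security being disabled on your cluster.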