
Oozie and Kerberos Hive

Hi there


I currently have a Kerberized Hortonworks 2.5.0 cluster and I am using Talend 6.4.1 for Big Data. I am struggling to get Oozie to run a tHiveRow component. I can run HDFS components fine through Oozie, but there seems to be an authentication issue I can't get around when connecting to Hive. I can run the tHiveRow job through Talend normally, but I can't schedule it; it complains that it can't authenticate the user with the keytab file.


I did notice that the workflow.xml being generated is missing the credentials tags. How can I insert these? Specifically:

<credential name='hive_auth' type='hcat'>
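For reference, a credentials block in an Oozie workflow.xml typically looks like the sketch below. The metastore URI and principal values are placeholders (assumptions), not your cluster's actual settings; each action that needs to talk to Hive then opts in via its `cred` attribute:

```xml
<workflow-app xmlns="uri:oozie:workflow:0.5" name="hive-cred-example">
  <credentials>
    <!-- 'hcat' credential type: Oozie obtains a Hive metastore delegation token -->
    <credential name="hive_auth" type="hcat">
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://metastore-host:9083</value> <!-- placeholder -->
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/_HOST@EXAMPLE.COM</value> <!-- placeholder -->
      </property>
    </credential>
  </credentials>

  <start to="hive-step"/>

  <!-- The action references the credential by name -->
  <action name="hive-step" cred="hive_auth">
    <!-- action body omitted -->
  </action>

  <!-- remaining transitions omitted -->
</workflow-app>
```

Note that since your component connects to HiveServer2 over JDBC, the `hive2` credential type (which takes `hive2.jdbc.url` and `hive2.server.principal` properties instead) may be the better fit than `hcat`; which one applies depends on how the generated action actually reaches Hive.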




The tHiveRow component just runs a simple TRUNCATE TABLE statement as a test.


Your assistance would be appreciated.




Re: Oozie and Kerberos Hive


Have you already defined the HDFS connection details in the Oozie scheduler view, and specified the path where your Job will be deployed? Would you mind posting screenshots of your settings to the forum? That would help us understand your current issue.

Here is the relevant online documentation: TalendHelpCenter: How to run a Job via Oozie

Best regards


Don't forget to give kudos when a reply is helpful and click Accept the solution when you think you're good with it.

Re: Oozie and Kerberos Hive

Hi Sabrina,


Thank you for replying. I can execute HDFS jobs fine through Oozie; it's just the Hive component that is giving me Kerberos errors. Below is the Job; I initially tried it with just tHiveRow.




I can run the above manually in Talend without problems. When I try it in Oozie, the job runs but fails with the error below; it seems the Java action in Oozie cannot get a Kerberos ticket for Hive, but can for HDFS tasks. I am running this on the same machine that Oozie is installed on, so finding the keytab file shouldn't be a problem.


SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/disks/disk11/hadoop/yarn/local/filecache/1564/mapreduce.tar.gz/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/disks/disk13/hadoop/yarn/local/filecache/1602/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in component tHiveConnection_1 (oozietest1)
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hivehost:10000/default;principal=xxx/ GSS initiate failed
	at org.apache.hive.jdbc.HiveConnection.openTransport(

After a lot of investigation, I read somewhere that we need to include a JAAS file for Oozie to execute Hive. How can I include a jaas.conf file in the job?
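For what it's worth, a JAAS configuration for a Kerberos keytab login usually looks something like the sketch below. The entry name "Client", the keytab path, and the principal are all placeholders (assumptions); the entry name your Hive JDBC driver actually looks up may differ:

```
/* jaas.conf sketch - entry name, keytab path, and principal are placeholders */
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/security/keytabs/myuser.keytab"
  principal="myuser@EXAMPLE.COM"
  storeKey=true
  useTicketCache=false;
};
```

One common way to attach such a file to an Oozie action is to ship it alongside the job with a `<file>` element in the action definition and point the JVM at it with `-Djava.security.auth.login.config=jaas.conf` in the action's `java-opts`. Both are standard Oozie workflow elements, but exactly where they go depends on the action type that Talend generates, so this is a sketch rather than a guaranteed fix.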