Four Stars

tHiveInput is not working in batch jobs (Spark) in a Kerberos cluster.

We are trying to use Talend batch (Spark) jobs to access Hive in a Kerberos cluster, but we are getting the error "Can't get Master Kerberos principal for use as renewer".

 

Using the standard (non-Spark) jobs in Talend, we are able to access Hive without any issue.

 

Sample Batch Job:

 

talend_issues.PNG

Below are the observations:

 

  1. When running Spark jobs, Talend is able to connect to the Hive metastore and validate the syntax; for example, if I provide a wrong table name, it returns "table not found".
  2. When we run select count(*) on a table that has no data, it returns "NULL", but if some data is present in HDFS for the table, it fails with the error "Can't get Master Kerberos principal for use as renewer".

I am not sure exactly what is causing the token problem. Could someone help us find the root cause?

One more thing to add: if I read/write to HDFS (instead of Hive) using Spark batch jobs, it works. So the only problem is with Hive and Kerberos.
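For context (this is generic Hadoop/Spark behavior, not Talend-specific): the "Can't get Master Kerberos principal for use as renewer" error is raised by Hadoop's token machinery when the client-side configuration does not define the YARN ResourceManager principal to use as the delegation-token renewer. A minimal sketch of the property that usually needs to be present in the yarn-site.xml visible to the Spark driver (the principal and realm below are placeholders):

```xml
<!-- Sketch: the property whose absence typically triggers
     "Can't get Master Kerberos principal for use as renewer".
     yarn/_HOST@EXAMPLE.COM is a placeholder, not a real principal. -->
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>yarn/_HOST@EXAMPLE.COM</value>
</property>
```

This would also fit the symptoms above: metastore calls and empty-table queries do not need an HDFS delegation token, while reading actual table data from HDFS does, so only the data-bearing query fails.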

Tags (1): Big Data
Moderator

Re: tHiveInput is not working in batch jobs (Spark) in a Kerberos cluster.

Hello,

The error indicates that you are trying to access a Kerberized resource with an unsecured client configuration.

In the batch job, did you select the Kerberos configuration in the tHDFSConfiguration component?
Also, where does the configuration come from: Repository or Built-In?

Best regards

Sabrina

Four Stars

Re: tHiveInput is not working in batch jobs (Spark) in a Kerberos cluster.

We are encountering the same issue. To answer your questions (in our case): yes, we have selected the Kerberos configuration in tHDFSConfiguration, and the configuration is Built-In.

 

Regards,

Erick

Four Stars

Re: tHiveInput is not working in batch jobs (Spark) in a Kerberos cluster.

Hi Sabrina,

 

Yes, we have already selected Kerberos in the HDFS configuration, and reading/writing HDFS with batch jobs works. The only problem is when it tries to select data from Hive using the tHiveInput component, and specifically in batch jobs, not in standard jobs.

 

Please clarify: does Talend use /etc/spark/conf/ for batch jobs?

 

Thanks
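One way to narrow this down, independent of Talend: check whether the yarn-site.xml your Spark driver actually reads declares the renewer principal. The sketch below (plain Python, not part of Talend; the property name comes from stock Hadoop, and the sample file content is a placeholder) parses a yarn-site.xml and reports whether yarn.resourcemanager.principal is set:

```python
import xml.etree.ElementTree as ET

def has_rm_principal(yarn_site_xml: str) -> bool:
    """Return True if yarn.resourcemanager.principal has a value.

    In stock Hadoop, a missing value here is the usual cause of
    "Can't get Master Kerberos principal for use as renewer",
    because the client cannot name a delegation-token renewer.
    """
    root = ET.fromstring(yarn_site_xml)
    for prop in root.findall("property"):
        name = prop.findtext("name")
        value = prop.findtext("value")
        if name == "yarn.resourcemanager.principal" and value:
            return True
    return False

# Placeholder yarn-site.xml content with the principal set.
sample = """<configuration>
  <property>
    <name>yarn.resourcemanager.principal</name>
    <value>yarn/_HOST@EXAMPLE.COM</value>
  </property>
</configuration>"""

print(has_rm_principal(sample))  # True
print(has_rm_principal("<configuration/>"))  # False
```

Running it against the file under /etc/spark/conf/ (or whatever HADOOP_CONF_DIR the job exports) would tell you whether the Spark batch job's classpath is missing this property even though the standard job's configuration has it.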