We are using Talend to load data from CSV files into Hortonworks HDP 2.6. All jobs were running fine, but after we enabled Ranger policies, every job fails with the error below. A similar issue was raised by other users earlier, but no confirmed solution is available. We tried a few suggestions from the Talend and HDP communities, but the issue persists. Please help in resolving this issue.
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Cannot modify mapreduce.framework.name at runtime. It is not in list of params that are allowed to be modified at runtime
Thanks in Advance
By default, the HDP cluster is configured to mark these values as "final", so they cannot be modified by a job; a cluster administrator can change them to non-final. Whether that is appropriate depends on many factors, such as the cluster's network configuration and security policies.
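As an illustration of what the administrator would see, a property marked final in a Hadoop config file looks like the sketch below (the property name matches the one in your error; `mapred-site.xml` is its usual location, but your cluster's layout may differ):

```xml
<!-- mapred-site.xml (illustrative): a property marked final
     cannot be overridden by an individual job at runtime -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
  <final>true</final>
</property>
```

Removing `<final>true</final>` (or setting it to false) is what "changing it to non-final" means in practice.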
In this case, as the error message shows, the cluster thinks the job is trying to modify a restricted parameter, hence the failure you are seeing.
Have you tried unchecking the Resource Manager checkbox in the tHive component you are using? Uncheck the Scheduler and Namenode checkboxes as well; that should make the error go away.
Also check whether you are passing that property explicitly through the Hadoop properties table of the component.
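If the job genuinely needs to set this parameter, one workaround we have seen (this is an assumption about your setup: it applies when HiveServer2 runs with SQL standard authorization, which Ranger-enabled clusters typically do) is to append the parameter to the HiveServer2 whitelist in hive-site.xml via Ambari, then restart HiveServer2:

```xml
<!-- hive-site.xml (sketch): the value is a regex that is appended to
     Hive's built-in whitelist of parameters a session may SET at runtime -->
<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapreduce\.framework\.name</value>
</property>
```

After the restart, a session-level `SET mapreduce.framework.name=...` should no longer be rejected; use this only if your security policy allows it.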
Let us know if this resolves the issue for you.