I have the Cloudera distribution imported along with the Talend VM. However, Talend Weave Scope is not showing the Hive host.
I am not sure whether the Hive services are started or not. If they need to be started, what is the command to run from the command prompt?
Where can I get the Hive host, port, resource manager URI, name node URI, database, username, password, etc. for Hive on Cloudera? Can you please help me find these configuration details?
Note: I have not installed Cloudera Manager separately due to memory constraints, so I believe that when the Cloudera distribution is imported, it installs Hive as well.
Are you using the Cloudera Quickstart VM? Your screenshot looks like you are trying to connect to a HortonWorks cluster, not a Cloudera one. In general, when you start the Quickstart VM, it will start the Hive service: the easiest way to check is to pull up Cloudera Manager inside the VM and look for the green circle next to the Hive service.
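If Cloudera Manager is not available (as in the original question), here is a hedged sketch of the commands typically used on a CDH-style VM to check and start the Hive daemons from a terminal. The service names below are the usual CDH package names; the Talend sandbox image may differ, so treat them as an assumption to verify:

```shell
# Check whether the Hive metastore and HiveServer2 daemons are running
sudo service hive-metastore status
sudo service hive-server2 status

# Start them if they are stopped
sudo service hive-metastore start
sudo service hive-server2 start

# HiveServer2 listens on port 10000 by default; the metastore uses 9083.
# Confirm something is actually listening on those ports:
sudo netstat -tlnp | grep -E '9083|10000'
```

If `netstat` shows nothing on port 10000, a JDBC connection via `jdbc:hive2://` cannot succeed.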
Here's a tutorial I wrote on how to connect Talend Open Studio to the Quickstart VM. I hope it's what you're looking for:
I have not installed the Cloudera Quickstart VM, only the Talend Big Data Sandbox with the Cloudera distribution provided by Talend.
I am trying to connect to Hive with the attached configuration and I am getting the error below. I tried other hosts as well, but nothing works. Your help on this is greatly appreciated.
Connection failure. You must change the Database Settings.
java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://talend:9083/default: java.net.ConnectException: Connection refused (Connection refused)
Sorry, I don't know enough about that distribution to be of much help. However, "Connection refused" means your request reached the host but nothing is listening on that port; it is not an authentication failure (a bad username or password would produce a different error). Also note that 9083 is the Hive metastore's Thrift port, while HiveServer2, which a jdbc:hive2:// URL expects, listens on port 10000 by default. Check that HiveServer2 is actually running, try port 10000 instead of 9083, and temporarily disable the firewall to rule it out. That's the best I can do from here.
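As a quick way to tell the two failure modes apart without the JDBC driver, here is a minimal sketch in plain Python that probes a host/port with a raw TCP connection: a refused connection from this probe means no daemon is listening there at all, while a successful connect means the port is open and the problem is further up the stack (driver, URL, credentials). The host name `talend` and the ports are taken from the error message above, so adjust them to your setup:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    A refused or unreachable connection here is the same condition
    behind the java.net.ConnectException in the JDBC error above.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 9083 is the Hive metastore Thrift port; HiveServer2 (the endpoint a
# jdbc:hive2:// URL expects) listens on 10000 by default.
for port in (9083, 10000):
    status = "open" if port_is_open("talend", port) else "closed/refused"
    print(f"talend:{port} is {status}")
```

If port 10000 reports open, retry the Talend connection against that port; if both report closed/refused, the Hive services are not running (or a firewall is blocking them).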