How to load Hive tables

Five Stars


Hi all,

I'm trying to load data from three different Hive tables in DB1 into a target table in DB2. Both source and target are Hive.

I'm unable to load using the tHiveLoad component after doing some joins with tMap. I tried the ELT components, but they work only when my source and target are in the same DB. My target tables in the Hive DB are internal tables in ORC format.

My current job design is

tHiveInput -> tHDFSOutput -> tHiveLoad

I suspect this is not the optimal way of designing the job. I'm developing a Big Data standard job; can anyone suggest a better design for it?
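For context, the tHDFSOutput + tHiveLoad pattern above amounts to staging a delimited file on HDFS and then loading it into Hive. A rough HiveQL sketch of that flow follows; the path, column, and table names are made up for illustration. Note that because the target is an internal ORC table, the load usually has to go through a text-format staging table, since LOAD DATA moves files as-is and cannot convert a delimited file into ORC:

```sql
-- Hypothetical names: /tmp/job_out, db2.stage_target, db2.target are illustrative only.

-- 1. Staging table whose layout matches the delimited file written by tHDFSOutput
CREATE TABLE IF NOT EXISTS db2.stage_target (id INT, name STRING)
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ';'
  STORED AS TEXTFILE;

-- 2. Roughly what tHiveLoad issues: move the HDFS file into the staging table
LOAD DATA INPATH '/tmp/job_out' INTO TABLE db2.stage_target;

-- 3. Copy into the internal ORC target; the INSERT...SELECT performs the ORC conversion
INSERT INTO TABLE db2.target SELECT id, name FROM db2.stage_target;
```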

For example: tOracleInput -> tMap -> tOracleOutput. Is there an equivalent way of doing this in Talend Data Fabric 6.3?

I found that the Hive input and output components are available only in Big Data Batch jobs.

Moderator

Re: How to load Hive tables

Hello,

ELT components use a single connection, which means they can only operate within the same database connection.
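Because everything runs inside one connection, the ELT Hive components push the whole transformation down to Hive as a single statement. A rough sketch of the kind of statement they generate (all table and column names here are illustrative, not taken from the job):

```sql
-- Illustrative only: src_a, src_b, and target stand in for real table names.
INSERT INTO TABLE target
SELECT a.id, a.name, b.amount
FROM src_a a
JOIN src_b b ON a.id = b.id;
```

Note that if both databases live on the same Hive service, database-qualified names such as `db1.src_a` and `db2.target` can appear in one statement over a single connection.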

Can you find the Hive input component in a Big Data standard job?

Best regards

[Attachment: Sabrina1.png]

Five Stars

Re: How to load Hive tables

Hi Sabrina,

Thanks for your reply. 
I mentioned my job design in my earlier post. I didn't find the tHiveOutput component in a standard job; it is only available in a Big Data Batch job. So, technically, in a standard job I need to push the data to HDFS with tHDFSOutput and then load it into Hive with tHiveLoad (triggered via OnComponentOk).

 

Regards

SS
