Hello, I have created a template-based job that dynamically builds a Hive query and loads the data into HDFS as CSV, using MySQL as input to hold the required metadata: database name, table name and its columns, HDFS location, file name, etc. I am able to read the data from Hive, but when I look at the generated file in HDFS it has only one column (and the reason is obvious from the design). Here is my job design:

tMysqlInput -> tMap -> tFlowToIterate -> tJavaRow -> tHiveInput -> tMap -> tHDFSOutput

(In the tHiveInput a dummy column is added and mapped through to tHDFSOutput.)

In MySQL I have two rows, where one Hive table has 2 columns and the other has 8 columns. Right now I am only seeing one row's data loaded into the target HDFS file. How can I dynamically retrieve all the columns from Hive and load them into HDFS as CSV? I'm stuck here because the Hive and HDFS components don't support the dynamic type. Please advise.
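Since tHiveInput needs a fixed schema, one common workaround is to make the generated Hive query itself flatten every column into a single string (HiveQL's standard concat_ws and cast functions), so a one-column schema can carry any table. A minimal sketch of building such a query from the comma-separated column list; the class and method names here are hypothetical, not part of any Talend component:

```java
// Sketch: build a HiveQL query that concatenates all configured columns into
// one CSV string column, so a fixed single-column tHiveInput schema works for
// tables with any number of columns. concat_ws and cast are standard HiveQL.
public class DynamicHiveQuery {

    /** columnsCsv is the comma-separated "columns" metadata field from MySQL. */
    public static String buildExportQuery(String tableName, String columnsCsv) {
        StringBuilder cols = new StringBuilder();
        for (String col : columnsCsv.split(",")) {
            if (cols.length() > 0) {
                cols.append(", ");
            }
            // cast every column to string so concat_ws accepts any column type
            cols.append("cast(`").append(col.trim()).append("` AS string)");
        }
        return "SELECT concat_ws(',', " + cols + ") FROM " + tableName;
    }

    public static void main(String[] args) {
        System.out.println(buildExportQuery("emp_db.employees", "id,name,salary"));
        // SELECT concat_ws(',', cast(`id` AS string), cast(`name` AS string),
        //                       cast(`salary` AS string)) FROM emp_db.employees
    }
}
```

In the job, a tJavaRow (or tMap expression) could assemble this string per metadata row and pass it into the tHiveInput query via a context variable or globalMap entry.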
Re: Loading templatized Hive data to HDFS as CSV file
Here is the input I'm reading from MySQL:

Database_Name varchar(255), table_Name varchar(255), columns varchar(255), Target_HDP_Loc varchar(255), Target_HDP_FileName varchar(255)

Attached is a screenshot of the job design.
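For reference, each MySQL metadata row drives one iteration of the job; a small sketch of how such a row might be modeled and how the target HDFS path could be derived from it (class and field names are hypothetical illustrations, not Talend-generated code):

```java
// Sketch of one metadata row, assuming the MySQL columns described above.
public class ExportTemplate {
    public final String databaseName;
    public final String tableName;
    public final String columns;            // comma-separated column list
    public final String targetHdpLoc;       // HDFS directory
    public final String targetHdpFileName;  // CSV file name

    public ExportTemplate(String databaseName, String tableName, String columns,
                          String targetHdpLoc, String targetHdpFileName) {
        this.databaseName = databaseName;
        this.tableName = tableName;
        this.columns = columns;
        this.targetHdpLoc = targetHdpLoc;
        this.targetHdpFileName = targetHdpFileName;
    }

    /** Full HDFS path for this row's CSV output (normalizes trailing slashes). */
    public String targetPath() {
        return targetHdpLoc.replaceAll("/+$", "") + "/" + targetHdpFileName;
    }

    public static void main(String[] args) {
        ExportTemplate t = new ExportTemplate("emp_db", "employees", "id,name",
                "/data/exports/", "employees.csv");
        System.out.println(t.targetPath()); // /data/exports/employees.csv
    }
}
```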