Hi, I have 4 tables in Oracle. I am writing a single job that will pull data from the Oracle tables and dump it to HDFS (instead of writing 4 different jobs for the 4 different tables).
I am taking the table name from a context variable.
But while pulling the data from Oracle I have to specify the schema statically, which defeats my dynamic objective.
Here is the job graph I am using:
In tOracleInput I need the "guess schema" step to be dynamic, so that it guesses the schemas for all the tables on the fly. In this component I have defined the schema with a single column of type "Dynamic".
tConvertType is used to convert the "Dynamic" type to "String".
tJavaRow is used to trim the data.
Finally, the data is dumped into HDFS.
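For reference, this is how the tOracleInput query can be built from the context variable so one job serves all four tables. This is only a sketch: `TABLE_NAME` is my placeholder for whatever your context variable is actually called.

```java
// Sketch: building the tOracleInput query from a context variable.
// "TABLE_NAME" is a placeholder name, not necessarily yours.
public class QuerySketch {
    static String buildQuery(String tableName) {
        // In the component's Query field this would simply be:
        //   "SELECT * FROM " + context.TABLE_NAME
        return "SELECT * FROM " + tableName;
    }

    public static void main(String[] args) {
        System.out.println(buildQuery("EMPLOYEES")); // SELECT * FROM EMPLOYEES
    }
}
```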
This whole setup is not working: the HDFS output file contains 0 rows, although no error is thrown.
Where am I going wrong?
I have tried the option you mentioned, but it loads every value as "null" in the HDFS file. Is there any way to get the source data instead of null?
String row = input_row.newColumn.toString();
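If the column is declared as type Dynamic, calling `toString()` on it does not serialize the row's values, which could explain the nulls. In Talend's generated Java, a Dynamic column is an object whose values are read per column; the idea is to iterate the columns, trim each value, and rebuild the row as a delimited String. Below is a self-contained sketch of that idea using a stand-in class, since the Talend runtime is not available here; the method names `getColumnCount()`/`getColumnValue(i)` mirror my recollection of Talend's Dynamic API, so verify them against your Studio version.

```java
import java.util.ArrayList;
import java.util.List;

// Simulates reading a Dynamic row column by column, the way tJavaRow
// code would with Talend's Dynamic type. DynamicRow is a stand-in;
// the real object comes from input_row.newColumn at runtime.
public class DynamicRowSketch {

    // Stand-in for a Dynamic row holding one value per source column.
    static class DynamicRow {
        List<Object> values = new ArrayList<>();
        int getColumnCount() { return values.size(); }
        Object getColumnValue(int i) { return values.get(i); }
    }

    // Equivalent of the tJavaRow body: iterate columns, trim, join.
    static String flatten(DynamicRow row, String sep) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < row.getColumnCount(); i++) {
            Object v = row.getColumnValue(i);
            if (i > 0) sb.append(sep);
            sb.append(v == null ? "" : v.toString().trim());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        DynamicRow r = new DynamicRow();
        r.values.add(" Alice ");
        r.values.add(42);
        r.values.add(null);
        System.out.println(flatten(r, "|")); // Alice|42|
    }
}
```

The key point is to read each column's value individually rather than stringifying the Dynamic object as a whole.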