Hi, I have 4 tables in Oracle. I am writing a single job that pulls data from the Oracle tables and dumps it to HDFS (instead of writing 4 different jobs for 4 different tables).
I am taking the table name from context.
But while pulling the data from Oracle I have to specify the schema statically, which defeats my dynamic objective.
Here is the graph I am using:
In tOracleInput I need the "guess schema" part to be dynamic, so it can guess the schemas for all the tables on the fly. In this tOracleInput component I have defined the schema with a single column entry of type "Dynamic".
tConvertType is used to convert the "Dynamic" type to "String".
tJavaRow is used for trimming the data.
Finally, the data is dumped into HDFS.
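For reference, the trimming step in tJavaRow boils down to trimming every column value of the incoming row. A minimal plain-Java sketch of that logic, runnable outside Talend with the row simulated as a String array (`trimRow` is a hypothetical helper for illustration, not a Talend API):

```java
// Sketch of the per-row trimming that tJavaRow performs in this job.
// The row is simulated as a String[]; inside Talend the values would
// come from the incoming row's columns instead.
public class TrimRowSketch {
    // Hypothetical helper: trims every value of one row, keeping nulls as-is.
    static String[] trimRow(String[] row) {
        String[] out = new String[row.length];
        for (int i = 0; i < row.length; i++) {
            out[i] = (row[i] == null) ? null : row[i].trim();
        }
        return out;
    }

    public static void main(String[] args) {
        String[] row = {"  alice ", "42 ", null};
        String[] trimmed = trimRow(row);
        System.out.println(trimmed[0] + "|" + trimmed[1]);
        // prints "alice|42"
    }
}
```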
This whole setup is not working: the HDFS output file contains 0 rows. It isn't throwing any error, though.
Where am I going wrong?
I have tried the option you mentioned below, but it is loading all the values as "null" in the HDFS file. Is there any way to get the source data instead of null?
String row = input_row.newColumn.toString();
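When the schema column is of type Dynamic, `input_row.newColumn` is a `routines.system.Dynamic` object, and its values generally have to be read per column (via `getColumnCount()` / `getColumnValue(i)` in Talend's Dynamic API) rather than through a single `toString()` call, which is likely why you see nulls. A self-contained sketch of that iteration, using a minimal stand-in for the Dynamic class so it runs outside Talend (the stand-in class and `buildRow` helper are illustrative assumptions, not Talend code):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for Talend's routines.system.Dynamic, just enough
// to illustrate iterating the column values (illustrative only).
class DynamicStandIn {
    private final List<Object> values = new ArrayList<>();
    void addValue(Object v) { values.add(v); }
    int getColumnCount() { return values.size(); }
    Object getColumnValue(int i) { return values.get(i); }
}

public class DynamicRowSketch {
    // Build one delimited, trimmed output line by walking the dynamic
    // columns, instead of relying on toString() on the whole object.
    static String buildRow(DynamicStandIn dyn, String sep) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < dyn.getColumnCount(); i++) {
            if (i > 0) sb.append(sep);
            Object v = dyn.getColumnValue(i);
            sb.append(v == null ? "" : v.toString().trim());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        DynamicStandIn dyn = new DynamicStandIn();
        dyn.addValue("  alice ");
        dyn.addValue(42);
        dyn.addValue(null);
        System.out.println(buildRow(dyn, "|"));
        // prints "alice|42|"
    }
}
```

Inside tJavaRow the same loop would run against `input_row.newColumn` directly, with the resulting string assigned to the output row's column.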