I have a Talend job that reads data from a SELECT query and creates a table to store the query output in various types of databases, such as SQL Server, Redshift, and BigQuery. The SELECT query varies for each run and is passed as a global variable from a central Meta DB; it pulls data from different sources and can produce different schemas. The job consists of two components: a tDBInput with a Dynamic column schema, and a tDBOutput that drops/re-creates the table as needed from that Dynamic schema.
I can make it work for most target databases, but not for BigQuery, because neither tDBInput nor tDBOutput offers BigQuery in the Database drop-down list.
I've also explored other workarounds. Has anyone experienced the same issue with BigQuery? Is there any workaround to add BigQuery to the tDBInput and tDBOutput components?
I'm using Talend 7.1 Enterprise with the Simba driver 4.2.
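For context, when the generic tDBInput/tDBOutput (tJDBC*) components are pointed at BigQuery, they need a Simba-style JDBC URL. Below is a minimal sketch of that URL shape; the project ID, service-account email, and key path are placeholders, not values from this thread:

```java
// Sketch: building the Simba BigQuery JDBC URL that the generic tJDBC*
// components would use. OAuthType=0 selects service-account authentication
// in Simba's URL scheme; all concrete values here are placeholders.
public class BigQueryJdbcUrl {
    static String buildUrl(String projectId, String saEmail, String keyPath) {
        return "jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
             + "ProjectId=" + projectId + ";"
             + "OAuthType=0;"
             + "OAuthServiceAcctEmail=" + saEmail + ";"
             + "OAuthPvtKeyPath=" + keyPath + ";";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("my-project",
                "loader@my-project.iam.gserviceaccount.com",
                "/secure/key.json"));
        // A tJDBCConnection would then open it with:
        // DriverManager.getConnection(buildUrl(...));
    }
}
```

The same URL would go into a tJDBCConnection component, with the Simba driver JARs registered in the job's driver list.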
The tBigQueryXXX components don't support dynamic schemas.
Could you load the files with their different schema structures into tFileOutputDelimited first, then migrate the data from each .csv file (read with tFileInputFullRow) to a tBigQueryOutput component?
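The reason this workaround sidesteps the schema problem is that tFileInputFullRow reads each line as a single raw String, without parsing columns. A minimal sketch of that behavior (class and method names are illustrative, not Talend internals):

```java
// Sketch of what tFileInputFullRow effectively does: read a delimited file
// one raw line at a time, keeping each row as a single String field, so any
// column structure passes through unchanged.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class FullRowReader {
    static List<String> readFullRows(String path) throws IOException {
        List<String> rows = new ArrayList<>();
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = br.readLine()) != null) {
                rows.add(line); // no column split: the whole row is one field
            }
        }
        return rows;
    }
}
```

Each full row can then be handed to tBigQueryOutput (or a bulk-load component) without the job ever needing to know the column layout.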
I used the JDBC (Simba) driver in tDBInput & tDBOutput, instead of tBigQueryInput & tBigQueryOutput, to connect to BigQuery. Do these components support DML actions such as DELETE, INSERT, and UPDATE? I am trying to build a job that runs multiple DML statements sequentially.
Since there is no tDBRow among the tBigQuery components, I am unable to run DML statements.
My target is to execute a DELETE statement, an INSERT statement, and an UPDATE statement in a single Talend job.
Please help me.
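Under the hood, chaining tJDBCRow components amounts to executing DML statements one after another on a single JDBC connection. A hedged sketch of that pattern (table and column names are made up for illustration; the connection itself would come from a tJDBCConnection using the Simba driver):

```java
// Sketch: running DELETE, INSERT and UPDATE sequentially over one JDBC
// connection -- the equivalent of chained tJDBCRow components.
// Table/column names below are illustrative only.
import java.sql.Connection;
import java.sql.Statement;

public class DmlSequence {
    static String[] statements = {
        "DELETE FROM mydataset.sales WHERE load_date = '2023-01-01'",
        "INSERT INTO mydataset.sales (id, amount) VALUES (1, 9.99)",
        "UPDATE mydataset.sales SET amount = 10.99 WHERE id = 1"
    };

    static void runAll(Connection conn) throws Exception {
        try (Statement st = conn.createStatement()) {
            for (String sql : statements) {
                // executeUpdate returns the affected-row count for DML
                int rows = st.executeUpdate(sql);
                System.out.println(rows + " row(s): " + sql);
            }
        }
    }
}
```

In Talend terms, each entry in the array corresponds to one tJDBCRow linked by OnSubjobOk triggers, all sharing the same tJDBCConnection.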