I am new to Talend and BigQuery.
I tried the steps below.
1) I am able to load data using the tBigQueryInput and tBigQueryOutput components. I filled in all the required fields as follows:
Authentication Mode : Service Account
Service account credential file (json file)
Project ID, Dataset, Table
Google Storage configuration (Access Key, Secret Key, Bucket, File)
Using the above details I am able to load the data (Ref. Manual_Connection.gif).
But if I put an Insert or Update statement in the tBigQueryInput Query field, it throws an error. Can we run Insert/Update in a simple transformation?
2) I tried the Simba JDBC drivers provided by Google Cloud (links below), but I am not able to write data into BigQuery tables. Please help me with this.
I created a JDBC connection string like this (Ref: JDBC Drivers_Connection). With it I am able to connect to the BigQuery metadata and drag tables into the design workspace, but while running the job with this JDBC connection I get the following error:
[Simba][JDBC](10220) Driver does not support this optional feature.
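For context, the connection string I created follows the Simba service-account URL format, roughly like this (the project ID, service-account email, and key file path here are placeholders, not my real values):

```text
jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;ProjectId=my-project;OAuthType=0;OAuthServiceAcctEmail=my-sa@my-project.iam.gserviceaccount.com;OAuthPvtKeyPath=/path/to/service_account.json;
```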
Please help me.
Thanks in advance.
Our Google BigQuery components rely on Google's Legacy SQL. This is a subset of SQL that limits actions to simple SELECT statements.
Google now offers Standard SQL, whose Data Manipulation Language (DML) supports extended actions such as INSERT, UPDATE, and DELETE operations.
For examples of the DML syntax, see https://cloud.google.com/bigquery/docs/reference/standard-sql/dml-syntax
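For example, with Standard SQL enabled, DML statements like the following are valid (the dataset and table names here are illustrative; note that BigQuery requires a WHERE clause on UPDATE and DELETE):

```sql
-- Insert new rows into a table
INSERT INTO mydataset.mytable (id, status)
VALUES (1, 'active'),
       (2, 'inactive');

-- Update existing rows in place (WHERE clause is mandatory in BigQuery)
UPDATE mydataset.mytable
SET status = 'archived'
WHERE id = 2;
```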
There is a Jira issue for this: https://jira.talendforge.org/browse/TBD-5959.
As per the Jira, the goal is to be able to run a job like this: <BigQueryInput> + a row component (create_table, drop_table, trunc_table) + an output component (including sub-queries where applicable).
To use the Row (tDBRow) component, we need to define a database connection component. Using the Simba JDBC drivers, I created a JDBC connection and am trying to write data into BigQuery, as mentioned in step 2 of my first post. Since I am new to Talend, am I doing this the correct way? Using the same drivers I was able to write data from the Pentaho open-source ETL tool, but from Talend I am not able to. Please correct me if I am doing anything wrong.