I wish to load a table into BigQuery and am using the tBigQueryOutput component to do so. I thought all that was required was the dataset name, the table name it should be loaded into, and the data coming from Talend.
However, I see that two other parameters are also requested:
1. Local File Name
2. File on Google Storage Configuration.
My questions are:
1. Why must there be a local file name, storing the output file locally before it is pushed into BigQuery? What do I have to do if I do not wish to store anything locally at all?
2. What exactly does the Google Storage configuration contain, and where is this file created? Without this file, the load does not happen. Where in BigQuery do I put the Google Storage details, and exactly what details should it contain?
I tried pushing records from Talend into BigQuery, but it keeps asking for the bucket details. Can anyone explain what kind of details should be present in gs://<<bucket_Name>>/<<filename_to_be_used_in_talend>>? PFB the screenshot of the parameters I mean.
This component puts your local file into a bucket in Google Cloud Storage and then imports it into the BigQuery table.
The Cloud Storage API must be activated, and the bucket must already be created.
Is that clear?
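To make the two-step flow concrete, here is a minimal Python sketch that builds the equivalent gsutil/bq CLI commands the component's behaviour corresponds to. The bucket, dataset, table, and file names are hypothetical placeholders, not values taken from the component:

```python
# Sketch of the staging flow tBigQueryOutput performs, expressed as
# the equivalent gsutil/bq CLI commands. All names are example values.
local_file = "/tmp/talend_output.csv"         # the "Local File Name" Talend writes
bucket = "my-staging-bucket"                  # must already exist in Cloud Storage
gcs_uri = f"gs://{bucket}/talend_output.csv"  # the "File on Google Storage" value

# Step 1: copy the local file into the Cloud Storage bucket.
upload_cmd = ["gsutil", "cp", local_file, gcs_uri]

# Step 2: load the staged file from the bucket into the BigQuery table.
load_cmd = ["bq", "load", "--source_format=CSV",
            "my_dataset.my_table", gcs_uri]

print(" ".join(upload_cmd))
print(" ".join(load_cmd))
```

The gs:// URI is exactly what the component asks for: the bucket name plus the object name the staged file will be given.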
You need both BigQuery and Google Cloud Storage.
For access, you need a client ID and an access key.
For Cloud Storage, go to the GCP Console and generate a key.
Hi, it is a file created by me. May I know what the content of the file usually is? Also, is it possible to load the data directly into BigQuery? It seems the data is not stored in Cloud Storage at all. What should be done in that case?
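On the direct-load question: BigQuery's load mechanism can also ingest a local file without staging it in Cloud Storage first (though the Talend component itself follows the staged path). A hedged sketch of the equivalent CLI call, with example dataset/table/file names:

```python
# Direct load of a local file into BigQuery, skipping Cloud Storage.
# Dataset, table, and file names are hypothetical examples.
direct_load_cmd = ["bq", "load", "--source_format=CSV",
                   "my_dataset.my_table", "/tmp/talend_output.csv"]
print(" ".join(direct_load_cmd))
```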