Hi guys, I'm working on a Talend job for our ETL process. In my data pipeline I need to fetch data from MySQL, apply some transformations, and upload the results to BigQuery. The problem I'm facing now is that I need to store Date values, which in BigQuery means the Timestamp type. But Talend doesn't support a timestamp type, and if I use Date, the tBigQueryOutput component generates the table schema with String type instead of Date or Timestamp. The question is how to do this properly so my data actually ends up with the Timestamp type, because otherwise I can't work with that column as a date column. For now the only thing missing is the correct type in the tBigQueryOutput component. I was able to manually import the file that had already been uploaded to Google Cloud Storage. Thanks for your answers!
Re: [resolved] Uploading to BigQuery timestamp data
So do I need to upload files to BigQuery by running a console command to invoke the console uploader? In that case Talend wouldn't be doing anything, and I'd even need to download the console utility alongside Talend, and so on. Is there a way to just set the correct data type on the importer? I've already added the required transformations for the data I'm receiving from MySQL; the only remaining question is how to generate the correct schema for the BigQuery table. Yes, currently I can export the data I upload to Google Cloud Storage without any changes, only changing the schema field type from STRING to TIMESTAMP.
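For what it's worth, the manual workaround I'm describing can be sketched like this (all names are hypothetical: a my_dataset.events table, a gs://my-bucket/events.csv export, and a created_at date column):

```shell
# Write an explicit BigQuery schema file so the date column gets TIMESTAMP
# instead of the STRING type that tBigQueryOutput generates.
cat > events_schema.json <<'EOF'
[
  {"name": "id",         "type": "INTEGER"},
  {"name": "event_name", "type": "STRING"},
  {"name": "created_at", "type": "TIMESTAMP"}
]
EOF

# With gcloud auth configured, the file already sitting in Cloud Storage
# could then be loaded with the corrected schema via the bq console utility:
#   bq load --source_format=CSV --skip_leading_rows=1 \
#       my_dataset.events gs://my-bucket/events.csv ./events_schema.json
```

This is exactly what I'd like to avoid: if the job could emit TIMESTAMP in the generated schema in the first place, no separate console step would be needed.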