We have a job that fetches data from an Oracle table and loads it into Google BigQuery. Is it possible to push the mapping logic down to the Google database side?
Is pushdown optimization available in Talend?
Are all the source and target schemas different? Do you want to load several target tables from several input files with one-to-one mapping? Is each file's structure different? More information would be helpful.
The source and target schemas can be the same, but the databases are different: the source is an Oracle database and the target is BigQuery.
Basically, we want to push most of the transformation logic to the Google BigQuery engine for better performance, as we used to do in traditional ETL tools.
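If native pushdown isn't available for your connector, the usual workaround is an ELT pattern: bulk-load the raw Oracle extract into a BigQuery staging table, then run the mapping logic as a single SQL statement that BigQuery executes entirely on its side (for example through a query-execution component such as tBigQueryRow). A minimal sketch of generating such a pushdown statement, assuming hypothetical table names and column mappings:

```python
def build_pushdown_sql(staging_table: str, target_table: str,
                       mapping: dict) -> str:
    """Render the transformation as one INSERT ... SELECT so the
    mapping runs inside BigQuery instead of the ETL engine.
    `mapping` maps target column name -> BigQuery SQL expression."""
    select_list = ",\n  ".join(
        f"{expr} AS {col}" for col, expr in mapping.items()
    )
    return (
        f"INSERT INTO `{target_table}`\n"
        f"SELECT\n  {select_list}\n"
        f"FROM `{staging_table}`"
    )

# Illustrative example: names and expressions are assumptions,
# not taken from the job described above.
sql = build_pushdown_sql(
    staging_table="my_project.stage.orders_raw",
    target_table="my_project.dw.orders",
    mapping={
        "order_id": "CAST(order_id AS INT64)",
        "order_date": "PARSE_DATE('%Y-%m-%d', order_dt)",
        "amount_usd": "ROUND(amount * fx_rate, 2)",
    },
)
print(sql)
```

The point of the pattern is that only the raw extract crosses the network; all casting, parsing, and arithmetic happens in BigQuery's engine, which is what traditional pushdown optimization achieves.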