How to handle frequently changing structure of a data source?

Six Stars


Hi,
I want to move data from a source table (RDBMS or flat file) to a target table (S3).

1. Source Data Set -- Say on day 1 the source table has 5 columns, and I want to move these 5 columns into one target table.

But on day 2, 3 new columns are added to the same source table. I want all 8 columns of the source table to be picked up dynamically, without defining or altering any schema, and all 8 columns to be created in the same target table, again without defining or altering any schema.

 

I want a dynamic solution, so that there is no need to define or alter the schema at either the source or the target end.

 

Can anyone please suggest an approach?
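Outside of any specific ETL tool, the general pattern being asked for is to discover the column list at run time instead of declaring it up front. A minimal sketch of that idea, using an in-memory sqlite3 table as a stand-in for the RDBMS source (the table and column names here are hypothetical):

```python
import sqlite3

# Simulate the scenario: on "day 1" the source table has 2 columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("INSERT INTO src VALUES (1, 'a')")

def extract_all_columns(conn, table):
    """Read every column the table currently has, with no fixed schema.

    The column names come from cursor.description, so newly added
    columns are picked up automatically on the next run.
    """
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    return cols, cur.fetchall()

cols, rows = extract_all_columns(conn, "src")
print(cols)  # ['id', 'name']

# "Day 2": a new column appears in the source; the same code sees it.
conn.execute("ALTER TABLE src ADD COLUMN city TEXT")
cols, rows = extract_all_columns(conn, "src")
print(cols)  # ['id', 'name', 'city']
```

The target side can then be driven from the same discovered column list (for example, generating the target file header or DDL from `cols`), so neither end carries a hand-maintained schema.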

Employee

Re: How to handle frequently changing structure of a data source?

Hi,

 

In Talend, the schema of the source and target DB cannot be modified dynamically.

 

Warm Regards,

 

Nikhil Thampi


Please appreciate our members by giving Kudos for spending their time for your query. If your query is answered, please mark the topic as resolved :-)
Six Stars

Re: How to handle frequently changing structure of a data source?

Hi, 
I'm using the commercial Talend Big Data Platform and am aware of the dynamic schema functionality, which lets me handle any number of columns dynamically.

But when the columns change on the source side, I want to be able to perform some transformations on them and then store or transfer them to the target, still using only the dynamic schema.

 

Has anyone worked on such a requirement before, or can anyone suggest an approach?
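In generic terms, transformations over a changing column set can be written against whatever keys arrive in each row, rather than against named columns. A small illustrative sketch (CSV output here is a hypothetical stand-in for the S3 target; the transform is just an example):

```python
import csv
import io

def transform_and_load(rows, transform):
    """Apply a transform to whichever columns arrive, then write the
    output with a header derived from the data itself, so the target
    needs no predefined schema."""
    fieldnames = list(rows[0].keys())  # discovered, not declared
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    for row in rows:
        writer.writerow({k: transform(v) for k, v in row.items()})
    return out.getvalue()

# "Day 2" rows with a newly added 'city' column -- the same code
# would also handle the original 2-column rows unchanged.
day2 = [{"id": 1, "name": "a", "city": "x"}]
result = transform_and_load(day2, lambda v: str(v).upper())
print(result)
```

The key point is that both the transformation and the target header are computed from the incoming row structure, which is the same idea a dynamic-schema flow relies on.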
