Managing the Column Names of a Delimited Input File


Hi all,

 

I have a Talend job that is responsible for reading input files with varied layouts and loading tables in SQL Server.

  • The schema is defined as dynamic (I'm using the Enterprise version)
  • I have metadata on every input to the solution stored in the SQL Server DB, and a tSetDynamicSchema component in the Talend job as well

 

The challenge I am facing is that tFileInputDelimited doesn't have the "Use existing dynamic" option; only tFileInputPositional does.

This creates a problem when the header record's column names don't match the naming convention I store in the DB, because I pass the metadata from the first job into the next job, and so on.

 

Is there a workaround for this?

Is there a way to substitute a new header based on my input metadata that I keep in the DB?

Other ideas?
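One workaround, until such an option exists, is to rewrite the header line yourself before tFileInputDelimited reads the file: look up the expected column names from your metadata table, then substitute them for the file's first line. Below is a minimal sketch of that substitution step (the method and column names are hypothetical, not part of any Talend API); in a job, logic like this could live in a routine called from a tJava component, writing the corrected content to a temp file that the input component then reads.

```java
public class HeaderRewrite {

    // Replace the first (header) line of delimited file content with the
    // column names the downstream dynamic schema expects. In practice,
    // expectedColumns would be loaded from the metadata table in SQL Server.
    public static String replaceHeader(String content,
                                       String[] expectedColumns,
                                       String delimiter) {
        int firstNewline = content.indexOf('\n');
        // Everything after the original header line; empty if the file
        // contains only a header.
        String body = firstNewline >= 0 ? content.substring(firstNewline + 1) : "";
        String newHeader = String.join(delimiter, expectedColumns);
        return newHeader + "\n" + body;
    }

    public static void main(String[] args) {
        // Hypothetical input whose header doesn't match the DB convention.
        String raw = "Cust Name;Amt;Dt\nAcme;100;2019-01-01\n";
        String fixed = replaceHeader(raw,
                new String[] {"customer_name", "amount", "load_date"}, ";");
        System.out.println(fixed);
    }
}
```

For large files you would stream line by line (e.g. with BufferedReader/BufferedWriter) rather than holding the whole file in memory, but the substitution itself is the same.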

Moderator

Re: Managing the Column Names of a Delimited Input File

Hello,

This would be a new feature.

To work with tSetDynamicSchema, a component needs the "Use existing dynamic" option, and currently only two of our components support it: tFileInputPositional and tFileOutputPositional.

A feature request for this already exists in Jira: https://jira.talendforge.org/browse/TDI-29723

Best regards

Sabrina
