How to handle 1000 fields mapping in tMap

Six Stars

How to handle 1000 fields mapping in tMap

Hello,

I have a requirement from a client: we have 3 sources (txt and csv files).

After a few transformations and filters, I have to join and map these sources together.

The final output file contains 1000 fields.

But when I try to map these 1000 fields in tMap, the Studio hangs and it becomes very difficult to do the mapping there.

Does anyone have a solution for this?

 

Community Manager

Re: How to handle 1000 fields mapping in tMap

1000 fields is a lot of fields! Do you really need all of them? If so, you should be able to do this, but there are no tricks. How much RAM does your machine have? Have you tried editing the Studio .ini file to increase the RAM used by the Studio?

https://community.talend.com/t5/Migration-Configuration-and/Allocating-more-memory-to-Talend-Studio/...
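For reference, since Talend Studio is Eclipse-based, the memory allocation lives in the JVM arguments after the `-vmargs` line of the Studio's `.ini` file. The values below are only an illustrative sketch, not official recommendations — pick a `-Xmx` that fits inside your machine's physical RAM:

```ini
-vmargs
-Xms1024m
-Xmx4096m
-XX:MaxMetaspaceSize=512m
```

`-Xms` is the initial heap size and `-Xmx` the maximum heap; raising `-Xmx` is what usually helps when the Studio hangs on very large schemas. Note that Eclipse-style `.ini` files treat every line as an argument, so don't add inline comments.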

Six Stars

Re: How to handle 1000 fields mapping in tMap

I have already allocated enough memory to Talend Studio and updated the .ini file, but I'm still struggling to map the fields and provide the transformation logic in tMap. Is there an alternative method for solving this kind of issue? I don't want to use anything database-related, such as a big SQL query or a stored procedure.

Community Manager

Re: How to handle 1000 fields mapping in tMap

I need to know a bit more about this. How is the job crashing? Does it just hang, or does the Studio close? Does the tMap actually show all of the columns you want to map?

 

It sounds like a memory issue. How much memory does your machine have? Keep in mind that 16GB of RAM is what I would recommend as a minimum for a machine running the Studio.
