Hello everyone. I am new to this tool and I am trying to use it for a migration job, but I have spent a whole day on unexpected behavior in tMap. There is one main flow and one lookup flow. In tMap I join the key of the main flow to the key of the lookup table, so whenever a row arrives from the main flow, the lookup should be resolved according to that join. For a given key from the main table there are 7 matching rows in the lookup table, but my job loads far more data than that. It seems the job loads the whole lookup table into memory first and only then joins it against the main flow. I want to change the job so that it fetches only the lookup rows that match the main-flow key, according to the join rule I defined in tMap.
Using these components, the joins are done in memory, so all of the lookup data is loaded before the main flow runs. If you want to limit the number of rows fetched from the lookup, you need to change the lookup model in tMap to "Reload at each row". Note that this is only suitable for a limited number of main rows, since the lookup query runs once per main row. Alternatively, you can look at using the ELT components to push the join down to the database.
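For illustration, pushing the join to the database (which is what the ELT components generate for you) amounts to something like the following. The table and column names here are assumptions, not taken from the original job:

```sql
-- Sketch of the join an ELT-style setup would run inside the DB,
-- instead of loading the lookup table into Talend's memory.
SELECT m.*, l.*
FROM main_table m
INNER JOIN lookup_table l
  ON l.key = m.key;
```

With this approach the database only returns the joined rows, so memory use in the job stays flat regardless of the lookup table's size.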
Did you even look at his picture? It's already configured as an inner join with "Reload at each row". syw1489: you have to put the key for your query into the section above the lookup columns (the "globalMap Key" expressions). For every main row the key is put into the global map, and you can then use it in your lookup query like `SELECT ... FROM A WHERE key = '"+globalMap.get("identifier")+"'`.
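To make that concrete, a typical "Reload at each row" setup looks like this. The key name "identifier" and the table/column names below are assumptions for the sketch, not values from the original job:

```java
// 1) In tMap, in the lookup's "globalMap Key" section, store the main row's key:
//      Expression:  row1.customer_id      Key:  "identifier"
//    (conceptually, tMap does: globalMap.put("identifier", row1.customer_id);)

// 2) In the lookup DB input component (e.g. a tMysqlInput), reference that key
//    in the query, so only the rows matching the current main row are fetched:
"SELECT id, name, amount
 FROM lookup_table
 WHERE customer_id = '" + globalMap.get("identifier") + "'"
```

For a numeric key, drop the surrounding quotes in the query. Keep in mind this query runs once per main row, which is why "Reload at each row" only pays off when the main flow is small relative to the lookup table.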