Hi, could someone help me please? I created my jobs in Talend on my machine, connecting to a server, and created new lookup tables. Everything worked with no problem, since the tables didn't have a lot of records in them. Now that everything is working fine, I am ready to use the right server and database to process my jobs.

I am using a tMap with the lookup tables and an inner join (Unique match, etc.). My lookup tables have more than 6,000,000 rows, and this does not seem to work. The first job I changed to connect to the right database on the server seems to be stuck when loading the lookup table rows: the status just says "Reading" and no rows are loaded at all. Before, I was getting an out-of-memory error message; now that I have increased the virtual memory on my machine, I am not getting any records loaded at all, it just keeps starting and nothing loads.

Could someone help me please, as I can't seem to get this to run at all? If you look at the screenshot, you will see the bridge drug codes lookup just says "Starting" and nothing happens. Therapy (the next lookup table) and bridge drug codes are my largest tables. I have tried to attach the print screen and hope you can help me. Many thanks.
Right, I am not so good at this Talend business. I have ticked the option to store temp data on disk; am I supposed to allocate the folder in the component settings? Does Talend create it automatically, or is this to specify a certain temp folder?
I'm having the same problem: I have a tMap where a lookup is done against a huge table. I have already changed the tMap to store on disk (temporary files) and increased the heap size in Talend. Any idea what else I can do to improve the memory usage? Thanks, Valentina
You can create a lookup query like this: "select id, foo, bar from tablename where id = " + globalMap.get("getId"), where getId is an input field coming from the Main flow of the tMap. Then, in the tMap, set the lookup model to "Reload at each row". This way only one record is selected per incoming row. The disadvantage is that there will be a database query for each row in your Main flow. Good luck!
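As a minimal sketch of what that expression does (the table name, the columns foo/bar, and the globalMap key "getId" are placeholders for illustration, not names from the original job), the per-row lookup query is just a Java string concatenation over globalMap:

```java
import java.util.HashMap;
import java.util.Map;

public class LookupQueryDemo {
    public static void main(String[] args) {
        // Simulate Talend's globalMap holding the current Main-row key.
        // "getId" is a hypothetical key set from the Main flow.
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("getId", 42);

        // The query string the lookup component would run for this row
        // (tablename, foo, bar are placeholder names):
        String query = "select id, foo, bar from tablename where id = "
                + globalMap.get("getId");
        System.out.println(query);
        // prints: select id, foo, bar from tablename where id = 42
    }
}
```

Note that concatenating values straight into SQL like this is fragile (and unsafe if the key can contain untrusted text); where the database component supports it, a parameterized query is the safer variant of the same idea.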
When you have a huge data file, it is recommended to insert all the lines of the file into a database before processing. It is much easier to optimize your job that way (cursors, indexes, ...). That is the best practice for the tMap option "Reload at each row".