Have you tried to free up some space in your database? Did you enable the "Use Batch Size" option in the tMSSqlOutput component?
Are you using Oracle on Azure?
Could you please have a look at this article: https://stackoverflow.com/questions/44188234/azure-exceeded-the-memory-limit-of-20-mb-per-session-fo...?
Have you already checked this article to see if it helps?
Hi, have you found any solution for this 20 MB size limit? We are hitting the same issue, but only when loading larger files (over 100 MB). We were able to load 35 files, each under 100 MB, but a 380 MB file fails with the same error. We tried various batch sizes (1000/500/100/10), still no luck. Please share if you found a solution. We are also loading data into Azure DB from Talend Enterprise Edition.
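For anyone tuning the batch size: the point of batching is that rows are flushed to the server every N rows instead of accumulating in one session. As a minimal sketch (this is illustrative Java, not Talend's actual tMSSqlOutput internals; the class and method names here are made up), the splitting logic looks like this:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {
    // Split rows into fixed-size batches, mirroring the idea behind the
    // "Use Batch Size" option: flush/commit every `batchSize` rows so a
    // single session never holds the whole file at once.
    static <T> List<List<T>> split(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            // subList gives a view of rows [i, i + batchSize), clamped to the end
            batches.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) rows.add(i);
        List<List<Integer>> batches = split(rows, 1000);
        // 2500 rows with batch size 1000 -> batches of 1000, 1000, 500
        System.out.println(batches.size());
        System.out.println(batches.get(2).size());
    }
}
```

Note that a smaller batch size only helps if each batch is actually committed separately; if the whole load still runs in one transaction, the server-side session can keep growing regardless of the batch size, which may be why 10/100/500/1000 all fail for the 380 MB file.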