how to ingest large SAP tables


Hi all,

We have SAP BW objects that contain several hundred thousand rows with over 100 fields each. We were getting out-of-memory exceptions, so we increased the memory for the job to 8 GB.

However, some (very important) tables have millions of rows and would need far more memory; a simple calculation (millions of rows times over 100 fields times tens of bytes per field) tells us more than 30 GB would be required. For other components I have seen similar posts saying you can enable data streaming, but that option does not seem to exist for the components we use (tSAPADSOInput/tSAPInfoObject). The question is: is Talend not capable of ingesting this amount of data? If it always loads everything into memory before writing it out, that is a major limitation.

A workaround would be to select only part of the data, but it also does not seem possible to define a query in the style 'SELECT * FROM table WHERE date > ...' for these components. I feel that a product like Talend should offer a way around this, but I might not have found it yet.

 

Thanks for any input!


Re: how to ingest large SAP tables

Count the total number of rows and use/create a row number,
or
use an ID column that is numeric.

 

Next comes iteration. With 350k records, work out the largest chunk of n records that still fits in memory.
On every iteration you then take, say, 25k records: rows 1-25k, 25k-50k, etc.
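
A minimal sketch of that chunking loop in plain Java (the numbers and names are illustrative, not a Talend API; in a real job a tLoop/iterate link would drive the input component the same way):

public class ChunkPlan {
    public static void main(String[] args) {
        int totalRows = 350_000; // from an initial row-count step, e.g. COUNT(*)
        int chunkSize = 25_000;  // largest chunk that fits in the job's heap

        for (int start = 1; start <= totalRows; start += chunkSize) {
            int end = Math.min(start + chunkSize - 1, totalRows);
            // Each pass extracts only rows start..end (via a row-number or
            // numeric-ID range filter) and writes them straight to a file
            // before the next pass begins, so memory use stays flat.
            System.out.printf("extract rows %d..%d%n", start, end);
        }
    }
}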

Note:
- Avoid tMap if possible; write the extracted data directly to a file (or another target).
- Minimize the size of the extracted strings, because they consume memory.

You could maybe use a different connection approach: SAP runs on top of a database engine, most commonly DB2, MS SQL Server, Oracle, or HANA, so you may be able to read the underlying tables directly.
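
If direct database access is permitted in your landscape (check first, since reading SAP application tables directly is often restricted), a plain JDBC read with a server-side filter and a streaming fetch size keeps memory flat. Here is a hypothetical sketch against HANA; the host, credentials, schema, table, and date column are all placeholders:

import java.sql.*;

public class DirectDbExtract {
    public static void main(String[] args) throws SQLException {
        // Placeholder HANA JDBC URL; DB2/MSSQL/Oracle work the same way
        // with their own drivers and URL formats.
        String url = "jdbc:sap://hana-host:30015/";
        try (Connection con = DriverManager.getConnection(url, "USER", "PASS");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT * FROM MYSCHEMA.MYTABLE WHERE LOAD_DATE > ?")) {
            ps.setFetchSize(10_000); // stream rows in batches instead of all at once
            ps.setDate(1, Date.valueOf("2024-01-01"));
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // write each row out immediately (file, target DB, ...)
                }
            }
        }
    }
}

This also gives you the 'SELECT * FROM table WHERE date > ...' style filter that the SAP components don't expose.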