I have a large set of rows (read from a file) that I process with some elaboration.
The elaboration time is not proportional to the number of rows but grows roughly exponentially: the first 10,000 rows are elaborated in about 2 minutes (say), the second 10,000 in 5-6 minutes, and so on.
My job is designed to write the result to the DB on a SubJobOk link, which fires once all rows have been parsed (from the file) and elaborated.
How could I split a result set (say 40k rows) into blocks of 10k rows, like a loop?
Ideally, I would need a result-set splitter: a component that takes a row link as input and supplies 10k rows at its output, 4 times consecutively.
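In plain Python terms (just a minimal sketch of the behaviour I'm after; the names `chunked`, `rows`, and the block size are illustrative, not actual Talend components), the splitter would look something like this:

```python
from itertools import islice

def chunked(rows, size):
    """Yield consecutive blocks of at most `size` rows from an iterable."""
    it = iter(rows)
    while True:
        block = list(islice(it, size))
        if not block:
            return
        yield block

# Hypothetical usage: process 40k rows in 4 consecutive blocks of 10k
rows = range(40_000)
for block in chunked(rows, 10_000):
    pass  # elaborate this block, then write it to the DB
```

Each block would be elaborated and written to the DB before the next block is fetched, which should keep the per-block elaboration time near the 2-minute mark instead of growing with the total row count.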
Any ideas on how to achieve this?