One Star

Storing rows in an in-memory table

Hi everyone
Let's suppose I have a job in which the entire contents of a table are recreated daily with data from a database that is not very available (frequent downtime and a slow connection).
How can I erase the local table ONLY if reading all the remote data succeeds?
I want to avoid a situation where the local table ends up only partially populated because of timeouts or disconnections during the map operation, which can take a very long time.
I thought of reading the remote table into a temporary in-memory store, and only proceeding with the map operation if that read succeeds.
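To make the idea concrete, here is a rough sketch of it in plain Java/JDBC (Talend jobs compile to Java, so I hope the idea carries over); the connection strings, table names and columns are only placeholders, not my real schema:

import java.sql.*;
import java.util.ArrayList;
import java.util.List;

// Sketch of the "buffer first, reload later" idea: the local table is only touched
// after the complete remote read has succeeded. All names below are placeholders.
public class BufferThenReload {
    public static void main(String[] args) throws Exception {
        try (Connection remote = DriverManager.getConnection("jdbc:remote-db-url", "user", "pw");
             Connection local = DriverManager.getConnection("jdbc:local-db-url", "user", "pw")) {

            // Step 1: read the entire remote table into memory.
            // If a timeout or disconnection happens here, the local table is never modified.
            List<Object[]> buffer = new ArrayList<>();
            try (Statement st = remote.createStatement();
                 ResultSet rs = st.executeQuery("SELECT id, name FROM remote_table")) {
                while (rs.next()) {
                    buffer.add(new Object[] { rs.getLong("id"), rs.getString("name") });
                }
            }

            // Step 2: only reached if the full read succeeded -- erase and repopulate.
            try (Statement del = local.createStatement();
                 PreparedStatement ins = local.prepareStatement(
                         "INSERT INTO local_table (id, name) VALUES (?, ?)")) {
                del.executeUpdate("DELETE FROM local_table");
                for (Object[] row : buffer) {
                    ins.setLong(1, (Long) row[0]);
                    ins.setString(2, (String) row[1]);
                    ins.executeUpdate();
                }
            }
        }
    }
}

The point is that the DELETE on the local table only runs once the in-memory buffer has been filled completely.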
Is it possible?
thank you
Moderator

Re: Storing rows in an in-memory table

Hi,
Do you want to delete your table on a schedule, and trigger the subjob only if reading the remote data succeeds? Could you elaborate on your case with an example?
Best regards
Sabrina
Seventeen Stars

Re: Storing rows in an in-memory table

What about good old transactions? I would do everything within a transaction, so that it either fails without leaving inconsistent data behind or succeeds completely.
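As a rough sketch in plain JDBC (table and column names are placeholders, and 'rows' stands for the data already read from the remote side), the erase-and-reload step wrapped in one transaction could look like this:

import java.sql.*;
import java.util.List;

// Sketch of the erase-and-reload step wrapped in a single transaction (plain JDBC).
// Table and column names are placeholders; 'rows' stands for the data already read
// from the remote database.
public class TransactionalReload {
    static void reload(Connection local, List<Object[]> rows) throws SQLException {
        local.setAutoCommit(false);                        // start an explicit transaction
        try (Statement del = local.createStatement();
             PreparedStatement ins = local.prepareStatement(
                     "INSERT INTO local_table (id, name) VALUES (?, ?)")) {

            del.executeUpdate("DELETE FROM local_table");  // other sessions still see the old rows

            for (Object[] row : rows) {
                ins.setLong(1, (Long) row[0]);
                ins.setString(2, (String) row[1]);
                ins.executeUpdate();
            }

            local.commit();                                // the new data becomes visible at once
        } catch (SQLException e) {
            local.rollback();                              // any failure keeps the old data intact
            throw e;
        }
    }
}

Either the commit makes the new data visible as a whole, or the rollback leaves the old data exactly as it was.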
One Star

Re: Storing rows in an in-memory table

You are right, jlolling, I forgot about transactions!
Using transactions means that I must use the open/commit transaction components, and if something fails after the connection is opened, it rolls back automatically, doesn't it?
thanks
enrico
Seventeen Stars

Re: Storing rows in an in-memory table

Yes, that's the normal way. To use transactions in Talend you have to use the connection component together with the commit and rollback components. In your input and output components, enable the option to use an existing connection.
If the transaction grows too large, you have to process the data in chunks.
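As a rough illustration of the chunking idea in plain JDBC (the chunk size and all names are arbitrary placeholders; here the rows are sent to the database in batches but still committed once at the end, so the all-or-nothing behaviour is kept):

import java.sql.*;
import java.util.List;

// Sketch of chunked inserts: rows are sent to the database in batches of CHUNK_SIZE
// so no single statement batch grows too large, while the commit still happens once
// at the end. Names and the chunk size are placeholders.
public class ChunkedReload {
    static final int CHUNK_SIZE = 1000;

    static void insertInChunks(Connection local, List<Object[]> rows) throws SQLException {
        local.setAutoCommit(false);
        try (PreparedStatement ins = local.prepareStatement(
                "INSERT INTO local_table (id, name) VALUES (?, ?)")) {
            int count = 0;
            for (Object[] row : rows) {
                ins.setLong(1, (Long) row[0]);
                ins.setString(2, (String) row[1]);
                ins.addBatch();
                if (++count % CHUNK_SIZE == 0) {
                    ins.executeBatch();        // send one chunk to the database
                }
            }
            ins.executeBatch();                // flush the last, possibly smaller chunk
            local.commit();                    // commit once, after all chunks are sent
        } catch (SQLException e) {
            local.rollback();
            throw e;
        }
    }
}

If a single transaction really cannot hold everything, you would commit per chunk instead, but then you give up the all-or-nothing guarantee.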