Hi. I have a Talend Open Studio job that I inherited that posts a dataset to a webservice via HTTP post. In order to throttle the amount of data being sent to the web service, the job utilizes a tLoop to divvy up the dataset into batches of 100 records, which are then sent to a tRESTClient component to do the post (see attached screen shot).
Lately, I have seen times where the web service times out. Is it possible to create logic that automatically retries the connection x number of times before timing out and killing the job? I was hoping there was a way to create the initial web service connection, put some retry logic around that, and then, if successful, point the tRESTClient component at the existing connection. However, it doesn't appear that this is possible.
Any help is greatly appreciated!
Web services don't work on a connection basis like a database. You make a call to the service and it is either accepted or not. Your idea of retrying, though, is perfectly reasonable.

I am not sure exactly how your job is looping from the image, but if you can identify whether a batch has been sent or not, you can implement retry logic. This is not always straightforward. Essentially your loop runs from left to right with every iteration, so at the end (on the very right) you know whether the batch was successful or not. You can make use of globalMap variables to keep state: if the batch fails, then when the loop next fires you can start the iteration knowing the previous batch did not send, and resend it. To do this you will need a mechanism that allows you to send the batch again, and you will also need logic for deciding when to give up trying. It looks like you are actually halfway there with what you have already, if I am honest.
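To make the retry idea concrete, here is a minimal sketch in plain Java of the kind of logic you could drop into a Talend routine or a tJava component. Everything here is an assumption for illustration: `retry`, `maxAttempts`, and the simulated batch send are hypothetical names, not Talend APIs. In the real job you would keep the attempt count and the "last batch succeeded" flag in globalMap variables, as described above.

```java
import java.util.concurrent.Callable;

public class RetryDemo {

    // Retry a call up to maxAttempts times, re-throwing the last failure.
    // In a Talend job, the attempt counter would live in globalMap instead
    // of a local variable, so it survives across loop iterations.
    public static <T> T retry(Callable<T> call, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e; // remember why this attempt failed
                System.out.println("Attempt " + attempt + " failed: " + e.getMessage());
            }
        }
        throw last; // give up after maxAttempts and let the job die
    }

    public static void main(String[] args) throws Exception {
        final int[] calls = {0};
        // Simulated batch send: times out twice, then succeeds.
        String result = retry(() -> {
            calls[0]++;
            if (calls[0] < 3) {
                throw new RuntimeException("timeout");
            }
            return "batch sent";
        }, 5);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

The same shape works inside the loop you already have: wrap the send, and only advance to the next batch of 100 when the wrapped call returns successfully.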