Spooling n-Records before Executing a Post Using tRest


TL;DR: I need to spool 100 records at a time from a larger file, then POST each batch to my API using tRest.

 

I have been trying to figure out how I would execute this scenario:

     Currently I am posting one record at a time to my tRest component using an Iterate connector; the service returns the response and I continue processing.

What I need to do is post the records to the API in chunks of up to 100 from the source file (250K lines): spool up to 100 records, then POST them as a single batch.

 

Current Flow

File Input (500K lines) --> tMap --> Iterate --> tRest (POST single record) --> tHashOutput (store response)

 

Future Flow

File Input (500K lines) --> tMap --> Spool-100-Lines --> tRest (POST array of 100) --> tHashOutput (store response)
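For illustration only, here is a minimal Python sketch of the batching logic the "Spool-100-Lines" step would need: accumulate records into groups of up to 100, serialize each group as a JSON array, and hand it to a POST function. The `post_fn` callable is a hypothetical stand-in for the actual tRest call; in Talend this buffering is typically done in Java (e.g. accumulating rows in a tJavaFlex before flushing), so treat this purely as a sketch of the algorithm, not a Talend implementation.

```python
import json
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def post_batches(records, post_fn, batch_size=100):
    """Group records into batches of `batch_size`, serialize each batch
    as a JSON array, and POST it via `post_fn`; collect the responses."""
    responses = []
    for batch in chunked(records, batch_size):
        payload = json.dumps(batch)  # body for one tRest POST
        responses.append(post_fn(payload))
    return responses
```

For a 250K-line source this yields 2,500 POSTs of 100 records instead of 250K single-record calls; the last batch simply carries the remainder if the total is not a multiple of 100.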

 

 

 

 

 

Carolus Holman