This is my first real day with Talend Open Studio, so please be patient with me. :-)
I want to call a REST API with three parameters: start_date and end_date (the current date), as well as an id for requesting the next rows if there are more than 500. The API returns at most 500 datasets as JSON, so I have to pass the id of the last received row with the next call (the API then returns the rows starting after the provided id).
At the end I want to add all received rows to a MySQL table.
What I have so far:
tJava_1: I set the start_date and end_date parameters to today's date via Java code, e.g. "2018-12-10" for today. I also set two more values, id and continueLoop. The id variable holds the last received id and is used for the next call (loop).
// tJava_1: initialize the variables for the first call
String currentDate = new java.text.SimpleDateFormat("yyyy-MM-dd").format(new java.util.Date());
globalMap.put("start_date", currentDate);
globalMap.put("end_date", currentDate);
globalMap.put("id", "0");
globalMap.put("continueLoop", true);
tRESTClient1: the API call with the three query parameters.
tExtractJSONFields: to map the results (see screenshot). I'm unsure whether the count value can be retrieved this way, since it sits one level outside the given "Loop Jsonpath query".
This is where I'm currently lost. The plan was to store the id of the last received dataset for a possible next call, and to set the continueLoop variable to false if the count value is not equal to 500.
// Here is the main part of the component,
// a piece of code executed in the row loop
globalMap.put("id", row3.id);
System.out.println(row3.count);
System.out.println(row3.id);
// End of the component, outside/closing the loop
System.out.println("Last Row: " + row3.id);
if (!row3.count.equals("500")) {
    globalMap.put("continueLoop", false);
}
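For reference, the loop-control idea above can be sketched as plain Java outside Talend. Here fetchPage is a hypothetical stand-in for the tRESTClient call (it just generates ids), but the carry-the-last-id-forward and stop-when-fewer-than-500 logic mirrors what the globalMap variables are doing:

```java
import java.util.ArrayList;
import java.util.List;

public class PagingSketch {
    static final int PAGE_SIZE = 500;

    // Hypothetical stand-in for the REST call: returns up to PAGE_SIZE ids
    // greater than lastId, out of `total` available rows.
    static List<Integer> fetchPage(int lastId, int total) {
        List<Integer> page = new ArrayList<>();
        for (int id = lastId + 1; id <= total && page.size() < PAGE_SIZE; id++) {
            page.add(id);
        }
        return page;
    }

    // Mirrors the Talend flow: keep requesting pages, carry the last
    // received id forward, and stop once a page has fewer than 500 rows.
    static List<Integer> fetchAll(int total) {
        List<Integer> all = new ArrayList<>();
        int lastId = 0;                // globalMap "id"
        boolean continueLoop = true;   // globalMap "continueLoop"
        while (continueLoop) {
            List<Integer> page = fetchPage(lastId, total);
            all.addAll(page);
            if (!page.isEmpty()) {
                lastId = page.get(page.size() - 1);
            }
            continueLoop = (page.size() == PAGE_SIZE);
        }
        return all;
    }

    public static void main(String[] args) {
        // 1234 rows should arrive in three calls: 500 + 500 + 234
        System.out.println(fetchAll(1234).size());
    }
}
```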
Currently I receive the error shown in the corresponding screenshot.
I appreciate any help from you guys; Thanks in advance for your time!
A short update on this: I managed to solve the error. It was caused by the mapping of the "count" variable in tExtractJSONFields_1.
So I now do manage to receive all the rows within the loop.
Now: how can I get the rows of data into a database (e.g. MySQL or BigQuery)?
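Within Talend, the usual way is to end the flow in a tMysqlOutput (or generic tDBOutput) component mapped to the target table. For comparison, the same insert in plain JDBC would look roughly like the sketch below; the table name api_rows, its columns, and the connection details are all made-up placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Collections;
import java.util.List;

public class MysqlInsertSketch {

    // Builds a parameterized INSERT statement for the given table and columns.
    static String insertSql(String table, String... cols) {
        String colList = String.join(", ", cols);
        String marks = String.join(", ", Collections.nCopies(cols.length, "?"));
        return "INSERT INTO " + table + " (" + colList + ") VALUES (" + marks + ")";
    }

    // Batch-inserts rows of (id, payload) pairs; names are hypothetical.
    static void insertRows(Connection conn, List<String[]> rows) throws Exception {
        String sql = insertSql("api_rows", "id", "payload");
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }

    public static void main(String[] args) throws Exception {
        // Connection URL and credentials are placeholders.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password")) {
            insertRows(conn, List.of(new String[]{"1", "{\"example\":true}"}));
        }
    }
}
```

Batching the inserts (addBatch/executeBatch) keeps the round trips down when you are writing up to 500 rows per API page.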