Yes, you can. If your Java component has a schema defined for insertion into the database, simply connect your data flow to a db_output component, configure it with the same schema and valid values for its other settings, and you are good to go.
Connect your tRestClient to a tJavaFlex and use this code (assuming the row is "row1")....
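The original code was not included in this post, but a minimal sketch of what the tJavaFlex "Main code" box would contain is shown below. In the job itself you would only write the single println line; the row structure is generated by Talend from your schema. The column name "body" is an assumption here, so match it to whatever String column your tRestClient output schema actually uses.

```java
// Hypothetical, self-contained sketch simulating the tJavaFlex step.
public class TJavaFlexSketch {
    // Stand-in for Talend's generated row class (names are assumptions).
    static class row1Struct {
        String body; // JSON payload column coming from tRestClient
    }

    public static void main(String[] args) {
        row1Struct row1 = new row1Struct();
        row1.body = "{\"status\":\"ok\"}"; // sample JSON for illustration

        // This is the only line you would place in the tJavaFlex Main code box:
        System.out.println(row1.body);
    }
}
```

If the println produces nothing in the Run console, the problem is upstream of the tJavaFlex.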
You should see the JSON from the webservice call printed to the console window now. If not, your tRestClient is not returning anything.
It might help to see a screenshot of your job AND a screenshot of your tRestClient's output schema. Sometimes the output schema can become corrupted.
....was to show you how to use the JSON in the tJavaFlex. I don't want to see the JSON. Was what you posted achieved by using the above code?
Are you saying that this helped you get hold of the JSON string? I'm not sure I understood what you meant in your post. Do you have a colleague who can help with the Java? I ask because it is very difficult to debug code remotely. It might make more sense to work with a colleague on this to share skills.
Get the job working and then come back if you have any questions on potential improvements. At the moment, there is nothing to improve.
@rhall_2_0: This is what my job looks like:
Where the tJavaFlex looks like this:
Analytics analyticsAPI = new Analytics();
analyticsAPI.parseJsonData(row2.string, (String) globalMap.get("tempFileURI"), student_id, sis_user_id, course_id, course_code);
It calls the Analytics routine and writes the output to a .csv file, which I then put into S3 using PutLadedData. Once this job succeeds, another subjob copies all the files from S3 to Redshift using the steps below:
Is there anything I should change, or does this look correct?
It looks OK, but if you are running the job that writes the data to a file in parallel, you will have to make sure that it is one file with a unique filename per parallel iteration. You cannot write to the same file in parallel.
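One way to guarantee a unique filename per parallel iteration is sketched below. In a Talend job you would typically build the name in a tJava/tJavaFlex and stash it in globalMap for the file output component to read; the directory, prefix, and method names here are assumptions for illustration.

```java
import java.util.UUID;

// Hypothetical helper: builds a collision-free output filename so that
// parallel iterations never write to the same file.
public class UniqueFileName {
    public static String build(String baseDir, String prefix) {
        // Thread id plus a random UUID stays unique even when several
        // iterations start in the same millisecond.
        return baseDir + "/" + prefix + "_"
                + Thread.currentThread().getId() + "_"
                + UUID.randomUUID() + ".csv";
    }

    public static void main(String[] args) {
        // Two calls from the same thread still yield distinct names.
        System.out.println(build("/tmp/out", "analytics"));
        System.out.println(build("/tmp/out", "analytics"));
    }
}
```

Each parallel branch then uploads its own file to S3, and the Redshift COPY can pick them all up with a common prefix.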