I have development experience with Talend standard jobs, but I am new to developing Big Data Batch Jobs in Talend. In standard jobs I have used tSchemaComplianceCheck to validate schema data types and lengths, and I am looking for an equivalent component in a Big Data Batch job. Can you please help me understand the best way to achieve data type and length validations in a batch job? Looking forward to a timely response.
Thanks and Regards,
The tSchemaComplianceCheck component is a Data Integration (DI) component and is not available in Big Data Spark jobs so far.
Have you tried using a DI job to orchestrate your Spark job via the tRunJob component?
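If orchestration via tRunJob is not an option, another workaround is to implement the type and length checks yourself inside the Spark job (for example in a tJavaRow/tMap, or as a filter expression). Below is a minimal, language-agnostic sketch of that validation logic in Python; the column names, length limits, and helper functions are purely illustrative, not Talend APIs:

```python
# Illustrative sketch of the row-level checks tSchemaComplianceCheck
# performs in a standard job: type compliance plus max-length compliance.
# SCHEMA, validate_row, and split_rows are hypothetical names.

SCHEMA = {
    "customer_id": {"type": int, "max_len": None},
    "name":        {"type": str, "max_len": 30},
    "email":       {"type": str, "max_len": 50},
}

def validate_row(row, schema=SCHEMA):
    """Return a list of violations for one row; an empty list means compliant."""
    errors = []
    for col, spec in schema.items():
        value = row.get(col)
        if not isinstance(value, spec["type"]):
            errors.append(f"{col}: expected {spec['type'].__name__}")
        elif spec["max_len"] is not None and len(value) > spec["max_len"]:
            errors.append(f"{col}: length {len(value)} > {spec['max_len']}")
    return errors

def split_rows(rows, schema=SCHEMA):
    """Split rows into (valid, rejects), analogous to the component's
    main flow vs. reject flow. Each reject carries its violation list."""
    valid, rejects = [], []
    for row in rows:
        errs = validate_row(row, schema)
        if errs:
            rejects.append((row, errs))
        else:
            valid.append(row)
    return valid, rejects
```

In a Spark batch job, the same per-row predicate can be applied as a filter on the dataset, routing non-compliant rows to a reject output just as the DI component does.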