We have a requirement to submit multiple Spark big data batch jobs to a single Azure Databricks cluster. When we submit more than one job at the same time to the cluster, the jobs start failing with the following error:
“org.talend.bigdata.launcher.utils.BigDataLauncherException: Status 400 for /api/2.0/clusters/restart”
After some analysis, I found that, according to the official Talend documentation, we cannot run multiple big data batch jobs in parallel on a single Databricks cluster. Please find the link below:
Is there a way I can run big data jobs in parallel on a single Databricks cluster?
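One possible workaround, assuming you can use ephemeral job clusters instead of one shared interactive cluster, is to submit each job through the Databricks Jobs REST API (`POST /api/2.0/jobs/runs/submit`) with its own `new_cluster` spec. Each run then gets a dedicated cluster, so parallel submissions do not contend for (or try to restart) the same cluster. The sketch below only builds the request payloads; the workspace URL, runtime version, node type, and jar/class names are placeholders you would replace with your own values.

```python
import json

# Hypothetical workspace URL and token; replace with your own.
DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"

def runs_submit_payload(run_name, jar_uri, main_class):
    """Build a payload for POST /api/2.0/jobs/runs/submit.

    Each submission carries a new_cluster spec, so Databricks spins up
    an ephemeral job cluster per run instead of reusing (and restarting)
    a single shared cluster.
    """
    return {
        "run_name": run_name,
        "new_cluster": {
            # Assumptions: pick a runtime/node type supported in your workspace.
            "spark_version": "7.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
        "spark_jar_task": {
            "main_class_name": main_class,
            "jar_uri": jar_uri,
        },
    }

# Two parallel batch jobs, each with its own ephemeral cluster.
payloads = [
    runs_submit_payload(f"batch-job-{i}", "dbfs:/jars/job.jar", "com.example.Main")
    for i in (1, 2)
]

# Each payload would be POSTed to
#   f"{DATABRICKS_HOST}/api/2.0/jobs/runs/submit"
# with an "Authorization: Bearer <token>" header, e.g. via requests.post().
print(json.dumps(payloads[0], indent=2))
```

Note this trades cluster reuse for isolation: ephemeral clusters add startup latency per run, but they avoid the shared-cluster restart that appears to cause the 400 error above.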
Could you please create a new feature request in the Talend bug tracker (Jira)?