We have a requirement to submit multiple Spark big data batch jobs to a single Azure Databricks cluster. When we submit more than one job at the same time to the same Databricks cluster, the jobs start failing with this error:
“org.talend.bigdata.launcher.utils.BigDataLauncherException: Status 400 for /api/2.0/clusters/restart”
After some analysis, I found that, according to the official Talend documentation, we cannot run multiple big data batch jobs in parallel on a single Databricks cluster. Please find the link below:
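For context, the 400 appears to come from a second job hitting the `/api/2.0/clusters/restart` endpoint while the cluster is already restarting. A workaround we are considering is to poll the cluster's state via the Databricks REST API and hold back a submission while the cluster is busy. A minimal sketch (the workspace URL, token, and cluster ID below are placeholders, and the set of "busy" states is our assumption):

```python
import json
import urllib.request

# States in which the cluster cannot safely accept another Talend job
# submission (assumed list based on the Databricks clusters API state values).
BUSY_STATES = {"PENDING", "RESTARTING", "RESIZING", "TERMINATING"}


def is_cluster_busy(state: str) -> bool:
    """Return True while the cluster is in a transitional state."""
    return state.upper() in BUSY_STATES


def cluster_state(host: str, token: str, cluster_id: str) -> str:
    """Fetch the current cluster state via GET /api/2.0/clusters/get."""
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/get?cluster_id={cluster_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["state"]
```

Waiting until `is_cluster_busy(...)` is False before launching the next job avoids the 400, but it effectively serializes the jobs rather than running them in parallel, which is why we are asking about true parallel execution.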
Is there a way to run big data jobs in parallel on a single Databricks cluster?
If not, could you please create a new feature request in the Talend Jira bug tracker?