The information in this section applies only to users of Talend products with Big Data support. It is also only relevant if you run your Spark Jobs on Databricks distributions, whether on Azure or AWS.
When you run a Job on an all-purpose cluster in Talend Studio, you can run virtually any workload. All-purpose (interactive) clusters persist for an undetermined duration: they keep running until you manually terminate them, and you can restart them if needed. Multiple users can share such a cluster for collaborative and interactive analytics.
When you run a Job on a job cluster in Talend Studio, the Job starts processing faster, and the cluster automatically shuts down when processing finishes, which lowers your cost of usage. Job clusters are created according to your Spark configuration, and you cannot restart them once they have shut down.
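The difference between the two cluster types is visible in the request payloads the Databricks Jobs API accepts: an all-purpose run references an existing cluster by its ID, while a job-cluster run embeds a one-off cluster specification. The sketch below builds both payload shapes; the cluster ID, runtime version, and node type are illustrative assumptions, not values produced by Talend Studio.

```python
# Hypothetical sketch of the two run styles as Databricks Jobs API task
# payloads. All concrete values (IDs, versions, node types) are assumptions.

# All-purpose cluster: the task points at an existing cluster by ID.
# The cluster keeps running after the Job finishes and can be shared.
all_purpose_task = {
    "task_key": "talend_spark_job",
    "existing_cluster_id": "0123-456789-abcdefgh",  # assumed cluster ID
    "spark_jar_task": {"main_class_name": "org.talend.MyJob"},  # assumed class
}

# Job cluster: the task embeds its own cluster spec. Databricks creates the
# cluster for this run and terminates it when processing finishes.
job_cluster_task = {
    "task_key": "talend_spark_job",
    "new_cluster": {
        "spark_version": "13.3.x-scala2.12",  # assumed Databricks runtime
        "node_type_id": "Standard_DS3_v2",    # assumed Azure VM type
        "num_workers": 2,
    },
    "spark_jar_task": {"main_class_name": "org.talend.MyJob"},  # assumed class
}

# An all-purpose run references a cluster; a job-cluster run carries its spec.
print("existing_cluster_id" in all_purpose_task)  # True
print("new_cluster" in job_cluster_task)          # True
```

Because a job cluster's definition travels with the run, its settings come entirely from your Spark configuration at submission time, which is why such a cluster cannot be restarted after it shuts down.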