
Creating a Spark Job for machine learning

This section explains how to create a Spark Job in Talend Studio for developing a machine learning routine.


  1. Open Talend Studio.
  2. In the Repository tree view, expand Job Designs.
  3. Right-click Big Data Batch and create a new Job, specifying Spark as the framework.
