
Creating a Spark Job for machine learning

This section explains how to create a Spark Job for developing a machine learning routine.

Procedure

  1. Open Talend Studio and expand Job Designs.
  2. Right-click Big Data Batch and create a new Job, specifying Spark as the framework. A sketch of the kind of machine learning routine such a Job can run is shown after this procedure.
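
The Job you create in Talend Studio is assembled from graphical components rather than hand-written code, but the sketch below illustrates, under assumed sample data and an assumed app name ("MachineLearningRoutine"), the kind of Spark MLlib logic a machine learning routine in such a Job typically expresses: assembling feature columns into a vector and fitting a simple model. It is an illustrative example, not the code Talend Studio generates.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler

object MachineLearningRoutineSketch {
  def main(args: Array[String]): Unit = {
    // Start a Spark session; "local[*]" is used here only so the sketch runs standalone.
    val spark = SparkSession.builder()
      .appName("MachineLearningRoutine")
      .master("local[*]")
      .getOrCreate()

    // Illustrative training data: a label column plus two numeric feature columns.
    val training = spark.createDataFrame(Seq(
      (1.0, 2.0, 3.1),
      (0.0, 0.5, 0.4),
      (1.0, 1.8, 2.9),
      (0.0, 0.3, 0.6)
    )).toDF("label", "f1", "f2")

    // Assemble the feature columns into the single vector column that MLlib estimators expect.
    val assembler = new VectorAssembler()
      .setInputCols(Array("f1", "f2"))
      .setOutputCol("features")

    // Fit a simple logistic regression model on the assembled features.
    val model = new LogisticRegression().fit(assembler.transform(training))

    println(s"Coefficients: ${model.coefficients}")
    spark.stop()
  }
}
```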
