Spark Job

A Talend Spark Job uses the Spark framework to process RDDs (Resilient Distributed Datasets) or Datasets on top of a given Spark cluster. A Spark Job can be either a Spark Batch Job or a Spark Streaming Job.
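To illustrate what "processing an RDD on a Spark cluster" means, here is a minimal standalone sketch in Scala of the kind of Spark Batch logic such a Job executes. It assumes a local Spark installation; the `local[*]` master, object name, and sample data are illustrative only and are not part of Talend's generated code.

```scala
// Minimal Spark Batch sketch (illustrative names, not Talend's generated code)
import org.apache.spark.sql.SparkSession

object SparkBatchSketch {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark on the local machine; a real Job targets a cluster
    val spark = SparkSession.builder()
      .appName("batch-sketch")
      .master("local[*]")
      .getOrCreate()

    // An RDD is a partitioned, fault-tolerant collection processed in parallel
    val rdd = spark.sparkContext.parallelize(Seq(1, 2, 3, 4))
    val doubled = rdd.map(_ * 2).collect()
    println(doubled.mkString(","))

    spark.stop()
  }
}
```

A Spark Streaming Job follows the same pattern but processes data continuously in micro-batches rather than as a single bounded batch.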
