Designing Spark Jobs

Built from the Spark-specific components, a Talend Spark Job uses the Spark framework to process RDDs (Resilient Distributed Datasets) on a given Spark cluster.
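
Independent of the Talend components themselves, a minimal sketch of the RDD processing model that such a Job builds on might look like the following. The `local[*]` master URL and the sample input data are illustrative assumptions, not part of any generated Job:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("RddSketch")
      .setMaster("local[*]") // assumption: replace with your cluster's master URL

    val sc = new SparkContext(conf)

    // Build an RDD from sample data, then transform it in parallel
    // across the cluster's partitions.
    val lines = sc.parallelize(Seq("hello spark", "hello talend"))
    val counts = lines
      .flatMap(_.split("\\s+"))   // split each line into words
      .map(word => (word, 1))     // pair each word with a count of 1
      .reduceByKey(_ + _)         // sum the counts per word across partitions

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

The transformations (`flatMap`, `map`, `reduceByKey`) are lazy; Spark only distributes the work across the cluster when an action such as `collect()` is invoked, which is the execution model underlying a Spark Job's component graph.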
