Getting started with a Storm Job

Talend Data Fabric Getting Started Guide


The rest of this section walks through the series of actions to be taken to create a Storm Job, or in other words, a Storm topology.

For further information about the architecture on which a Talend Storm Job runs, as well as about other related advanced features, see the Talend Studio User Guide.

  1. Once the Studio is launched, in the Repository of the Integration perspective, right-click the Job Designs node and select Create Big Data Streaming Job from the contextual menu.

    The [New Big Data Streaming Job] wizard opens to help you define the main properties of the new Job.

  2. Fill in the Job properties.

    The fields correspond to the following properties:

    Name: the name of the new Job.

    A warning message appears if you enter prohibited characters.

    Framework: the computation framework you need to use to create the Job. This framework can be, for example, MapReduce or Spark.

    Purpose: the Job purpose, or any useful information regarding the Job's use.

    Description: any information that helps you describe what the Job does and how it does it.

    Author: a read-only field that shows by default the current user login.

    Locker: a read-only field that shows by default the login of the user who owns the lock on the current Job. This field is empty when you are creating a Job and holds data only when you are editing the properties of an existing Job.

    Version: a read-only field. You can manually increment the version using the M and m buttons.

    Status: a list from which to select the status of the Job you are creating.

    Path: a list from which to select the folder in which the Job will be created.

  3. Once you click Finish, an empty design workspace opens, showing the name of the new Job as a tab label.
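
The wizard's warning about prohibited characters in the Name field can be illustrated with a small check. The sketch below is hypothetical: this section does not state the exact character rule Talend Studio enforces, so the regex (letters, digits, and underscores only) and the `isValidJobName` helper are assumptions for illustration, not the Studio's actual validation logic.

```java
// Hypothetical sketch of a job-name check. The permitted character set
// (letters, digits, underscores) is an assumption; Talend Studio's real
// rule may differ.
public class JobNameCheck {

    // Returns true when the candidate name is non-empty and contains
    // only characters assumed to be permitted.
    static boolean isValidJobName(String name) {
        return !name.isEmpty() && name.matches("[A-Za-z0-9_]+");
    }

    public static void main(String[] args) {
        System.out.println(isValidJobName("MyStormJob")); // prints true
        System.out.println(isValidJobName("my job!"));    // prints false
    }
}
```

In practice the wizard performs this validation itself and simply displays a message when the name is invalid; the sketch only makes the idea concrete.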