Converting Jobs - Cloud - 7.3

Talend Studio User Guide

Version
Cloud
7.3
Language
English
Product
Talend Big Data
Talend Big Data Platform
Talend Cloud
Talend Data Fabric
Talend Data Integration
Talend Data Management Platform
Talend Data Services Platform
Talend ESB
Talend MDM Platform
Talend Real-Time Big Data Platform
Module
Talend Studio
Content
Design and Development
Last publication date
2024-02-13
Available in...

Big Data

Big Data Platform

Cloud Big Data

Cloud Big Data Platform

Cloud Data Fabric

Data Fabric

Real-Time Big Data Platform

You can create a Job by converting an existing Job from a different framework, for example from Standard to Spark.

This option is recommended when the components used in the source Job are also available in the target framework, for example when converting a Standard Job to a Spark Batch Job.

Procedure

  1. Right-click the Standard Job you want to convert and select Edit properties from the contextual menu. The Edit properties window is displayed.
  2. If you need to modify the descriptive information of the Job, make the changes in the corresponding fields.
    To change the information in the read-only fields, use the Project settings wizard. For further information, see Customizing project settings.
  3. From the Job Type list, select Big Data Batch, and from the Framework list, select MapReduce. Click Finish to close this window.

Results

The converted Job appears under the Big Data Batch node.

Note that you can also select Duplicate from the contextual menu to perform the conversion; this approach keeps the source Job in its original framework and creates the duplicate in the target framework.

You can also repeat this procedure to convert a MapReduce Job to a Standard Job or to another framework. In that case, the option to select from the contextual menu is Edit Big Data Batch properties.

If the Standard Job uses a Hadoop connection defined in the Repository, then when you click Finish, the Select node wizard is displayed, allowing you to select that connection and automatically reuse it in the MapReduce Job to be created.

Each component that appears in this wizard is used by the source Standard Job. By selecting a component, you reuse the Hadoop connection metadata it contains in the MapReduce Job you are creating.