Setting up the Job - Cloud - 8.0

Kafka

Version
Cloud
8.0
Language
English
Product
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Real-Time Big Data Platform
Module
Talend Studio
Last publication date
2024-02-29

Set up the Job by linking the components to construct the data flow.

Procedure

  1. In the Integration perspective of Talend Studio, create an empty Spark Streaming Job from the Job Designs node in the Repository tree view.
  2. In the workspace, enter the names of the components to be used and select them from the list that appears. In this scenario, the components are tSetKeystore, tKafkaInputAvro, tWindow, tFilterRow, tAggregateRow, and tLogRow.
  3. Connect tKafkaInputAvro to tWindow, and then connect tWindow to tFilterRow using the Row > Main link.
  4. Connect tFilterRow to tAggregateRow using the Row > Filter link.
  5. Connect tAggregateRow to tLogRow using the Row > Main link.
  6. Leave the tSetKeystore component unconnected; a conceptual sketch of the resulting data flow follows this procedure.
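
Although the Job is built graphically, the data flow it defines is comparable to a small Spark streaming pipeline that reads Avro records from Kafka, windows them, filters rows, aggregates per window, and prints the result. The sketch below is a conceptual illustration only, not the code Talend Studio generates; the broker address, topic name, Avro schema, field names, window size, and filter condition are assumptions made for the example.

    # Conceptual PySpark Structured Streaming sketch of the same data flow.
    # Requires the spark-sql-kafka and spark-avro packages on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.avro.functions import from_avro
    from pyspark.sql.functions import col, window, count

    spark = SparkSession.builder.appName("kafka_avro_window_sketch").getOrCreate()

    # Avro schema of the incoming records (assumed for illustration).
    avro_schema = """
    {"type": "record", "name": "Event",
     "fields": [{"name": "user", "type": "string"},
                {"name": "status", "type": "string"}]}
    """

    # tKafkaInputAvro: consume Avro-encoded messages from a Kafka topic.
    # tSetKeystore would correspond to SSL options on this source
    # (for example kafka.ssl.truststore.location), omitted here.
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
           .option("subscribe", "events")                        # assumed topic
           .load())

    events = (raw
              .select(from_avro(col("value"), avro_schema).alias("event"),
                      col("timestamp"))
              .select("event.*", "timestamp"))

    # tFilterRow: keep only the rows of interest (assumed condition).
    # tWindow + tAggregateRow: group the stream into time windows and aggregate.
    aggregated = (events
                  .filter(col("status") == "error")
                  .groupBy(window(col("timestamp"), "10 seconds"), col("user"))
                  .agg(count("*").alias("error_count")))

    # tLogRow: print each micro-batch to the console.
    query = (aggregated.writeStream
             .outputMode("update")
             .format("console")
             .start())
    query.awaitTermination()

In the actual Job, each of these operations is configured on the corresponding component rather than written by hand.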

Results

The Job is set up and the components are ready to be configured.