Setting up the Job - Cloud - 8.0

Kafka

Version
Cloud
8.0
Language
Japanese
Product
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Open Studio for Big Data
Talend Real-Time Big Data Platform
Module
Talend Studio
Content
Job design and development > Third-party systems > Messaging components > Kafka
Data Governance > Third-party systems > Messaging components > Kafka
Data Quality and Preparation > Third-party systems > Messaging components > Kafka
Last publication date
2023-09-14

Set up the Job by linking the components to construct the data flow.

Procedure

  1. In the Integration perspective of Talend Studio, create an empty Spark Streaming Job from the Job Designs node in the Repository tree view.
  2. In the workspace, enter the name of each component to be used and select it from the list that appears. In this scenario, the components are tSetKeystore, tKafkaInputAvro, tWindow, tFilterRow, tAggregateRow, and tLogRow.
  3. Connect tKafkaInputAvro to tWindow, and then connect tWindow to tFilterRow using the Row > Main link.
  4. Connect tFilterRow to tAggregateRow using the Row > Filter link.
  5. Connect tAggregateRow to tLogRow using the Row > Main link.
  6. Leave the tSetKeystore component unconnected. For a conceptual view of the data flow that these connections build, see the sketch after this procedure.
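
Conceptually, the connected components implement a streaming pipeline that reads Avro records from Kafka, windows and filters them, aggregates the result, and prints it. The following Spark Structured Streaming sketch illustrates that flow only; it is not the code Talend Studio generates, and the broker address, topic name, Avro schema, and column names are assumptions made for illustration (it also requires the spark-sql-kafka and spark-avro dependencies).

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.avro.functions.from_avro;
    import static org.apache.spark.sql.functions.*;

    public class KafkaAvroFlowSketch {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("KafkaAvroFlowSketch")
                    .getOrCreate();

            // Assumed Avro schema of the incoming messages.
            String avroSchema = "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                    + "{\"name\":\"country\",\"type\":\"string\"},"
                    + "{\"name\":\"amount\",\"type\":\"double\"},"
                    + "{\"name\":\"ts\",\"type\":\"long\"}]}";

            // tKafkaInputAvro-like step: read Avro-encoded records from a Kafka topic.
            Dataset<Row> events = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092") // assumed broker
                    .option("subscribe", "events")                       // assumed topic
                    .load()
                    .select(from_avro(col("value"), avroSchema).as("e"))
                    .select("e.*")
                    .withColumn("event_time", to_timestamp(from_unixtime(col("ts"))));

            // tFilterRow + tWindow + tAggregateRow-like steps: keep matching rows,
            // then count and sum them per 10-second window and per country.
            Dataset<Row> perWindow = events
                    .filter(col("amount").gt(100))
                    .groupBy(window(col("event_time"), "10 seconds"), col("country"))
                    .agg(count(lit(1)).alias("nb_lines"), sum("amount").alias("total"));

            // tLogRow-like step: print each micro-batch to the console.
            perWindow.writeStream()
                    .outputMode("update")
                    .format("console")
                    .start()
                    .awaitTermination();
        }
    }

In the Talend Job itself, none of this code is written by hand: each step of the sketch corresponds to one of the components connected above, and their behavior is defined in the component Basic settings in the next sections.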

Results

The Job is set up and the components are ready to be configured.