Configuring a Spark stream for your Apache Spark streaming Job - Cloud - 8.0

MongoDB

Version: Cloud 8.0
Language: English
Product: Talend Big Data, Talend Big Data Platform, Talend Data Fabric, Talend Real-Time Big Data Platform
Module: Talend Studio
Content:
  Data Governance > Third-party systems > NoSQL components > MongoDB components
  Data Quality and Preparation > Third-party systems > NoSQL components > MongoDB components
  Design and Development > Third-party systems > NoSQL components > MongoDB components
Last publication date: 2024-02-20
Define how often your Spark Job creates and processes micro batches.

Procedure

  1. In the Batch size field, enter the time interval at the end of which the Job reviews the source data for changes and processes a new micro batch (see the sketch after this procedure).
  2. If need be, select the Define a streaming timeout check box and, in the field that is displayed, enter the time frame at the end of which the streaming Job automatically stops running.
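
For reference, the following minimal Java sketch shows how these two settings relate to the underlying Spark Streaming API: the batch interval passed to the streaming context plays the role of the Batch size field, and awaitTerminationOrTimeout plays the role of the streaming timeout. This is an illustrative sketch, not code generated by Talend Studio; the socket source, host, port, and durations are placeholder values.

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingIntervalSketch {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("streaming_interval_sketch")
                    .setMaster("local[2]");

            // Batch interval: how often the incoming stream is cut into micro
            // batches and processed (the role of the Batch size field).
            JavaStreamingContext jssc =
                    new JavaStreamingContext(conf, Durations.seconds(5));

            // Placeholder source; a real Job would read from MongoDB or
            // another configured input component.
            jssc.socketTextStream("localhost", 9999)
                .count()
                .print();

            jssc.start();

            // Optional timeout: stop the streaming application after a fixed
            // time frame (the role of Define a streaming timeout); here, 10 minutes.
            jssc.awaitTerminationOrTimeout(10 * 60 * 1000L);
            jssc.stop(true, true);
        }
    }

A shorter batch interval lowers end-to-end latency but increases scheduling overhead, so the interval is usually set no lower than the time needed to process one micro batch.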