Configuring a Spark stream for your Apache Spark streaming Job - 7.1

Define how often your Spark Job creates and processes micro batches.

Procedure

  1. In the Batch size field, enter the time interval at the end of which the Job checks the source data for new changes and processes them as a new micro batch.
  2. If needed, select the Define a streaming timeout check box and, in the field that is displayed, enter the time frame after which the streaming Job automatically stops running. Both settings are illustrated in the sketch after this procedure.
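These two settings correspond to the micro-batch interval and the termination timeout of the Spark Streaming API that the Job relies on. The following is a minimal, hand-written sketch of that correspondence, assuming direct use of the Spark Streaming API rather than the code generated by the Studio; the application name, the 2-second batch size, and the 60 000 ms timeout are illustrative values, not product defaults.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingConfigSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative application name and master; a real Job takes these
    // from its Spark configuration tab.
    val conf = new SparkConf()
      .setAppName("StreamingConfigSketch")
      .setMaster("local[2]")

    // Batch size: the micro-batch interval at the end of which the source
    // data is checked for changes and processed as a new micro batch.
    val ssc = new StreamingContext(conf, Seconds(2))

    // ... define the input streams and transformations of the Job here ...

    ssc.start()

    // Streaming timeout: stop waiting after the given time frame
    // (in milliseconds) instead of running indefinitely.
    ssc.awaitTerminationOrTimeout(60000L)
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }
}
```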