Configuring a Spark stream for your Apache Spark streaming Job

Define how often your Spark Job creates and processes micro batches.


  1. In the Batch size field, enter the time interval at the end of which the Job checks the source data for changes and processes a new micro batch.
  2. If needed, select the Define a streaming timeout check box and, in the field that appears, enter the time frame after which the streaming Job automatically stops running.
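The two settings above interact as a simple loop: every batch interval, a new micro batch is processed, and the optional timeout bounds how long the loop runs. The sketch below illustrates this behavior in plain Python; `run_streaming_job` and `process_batch` are illustrative names, not Talend or Spark APIs.

```python
import time

def run_streaming_job(process_batch, batch_size_s, timeout_s=None):
    """Process a micro batch every batch_size_s seconds.

    If timeout_s is set, the loop stops automatically once that much
    time has elapsed (the streaming timeout); otherwise it runs until
    the Job is stopped externally.
    """
    start = time.monotonic()
    batches = 0
    while timeout_s is None or time.monotonic() - start < timeout_s:
        time.sleep(batch_size_s)   # wait for the batch interval to elapse
        process_batch(batches)     # review source data, process the new micro batch
        batches += 1
    return batches
```

In Spark's DStream API, which Talend Spark streaming Jobs build on, the batch size corresponds to the batch duration passed when creating the `StreamingContext`, and the streaming timeout corresponds to stopping via `awaitTerminationOrTimeout`.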
