tDynamoDBConfiguration properties for Apache Spark Streaming - 7.3

Amazon DynamoDB

Version
7.3
Language
English
Product
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Real-Time Big Data Platform
Module
Talend Studio
Last publication date
2024-02-21

These properties are used to configure tDynamoDBConfiguration running in the Spark Streaming Job framework.

The Spark Streaming tDynamoDBConfiguration component belongs to the Storage and Databases families.

This component is available in Talend Real-Time Big Data Platform and Talend Data Fabric.

Basic settings

Access key

Enter the access key ID that uniquely identifies an AWS Account. For further information about how to get your Access Key and Secret Key, see Getting Your AWS Access Keys.

Secret key

Enter the secret access key, which, combined with the access key, constitutes your security credentials.

To enter the secret key, click the [...] button next to the Secret key field, enter the secret key between double quotes in the pop-up dialog box, and click OK to save the settings.

Region

Specify the AWS region by selecting a region name from the list. For more information about the AWS Region, see Regions and Endpoints.

Use End Point

Select this check box and, in the field that is displayed, specify the Web service URL of the DynamoDB database service.
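
As an illustration only, the following sketch shows how these Basic settings would map onto a DynamoDB client built directly with the AWS SDK for Java v1. The class, method, and parameter names are hypothetical; this is not the code Talend Studio generates.

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.client.builder.AwsClientBuilder;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
    import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;

    public class DynamoDbClientSketch {
        // endpoint may be null when the Use End Point check box is cleared.
        public static AmazonDynamoDB connect(String accessKey, String secretKey,
                                             String region, String endpoint) {
            // Access key + Secret key form the static AWS credentials.
            BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
            AmazonDynamoDBClientBuilder builder = AmazonDynamoDBClientBuilder.standard()
                    .withCredentials(new AWSStaticCredentialsProvider(credentials));
            if (endpoint != null) {
                // Use End Point selected: target an explicit Web service URL.
                builder.withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration(endpoint, region));
            } else {
                // Otherwise the Region setting alone determines the endpoint.
                builder.withRegion(region);
            }
            return builder.build();
        }
    }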

Advanced settings

Connection pool

In this area, you configure, for each Spark executor, the connection pool used to control the number of connections that stay open simultaneously. The default values of the following connection pool parameters suit most use cases (an illustrative sketch follows this list).

  • Max total number of connections: enter the maximum number of connections (idle or active) that are allowed to stay open simultaneously.

    The default number is 8. If you enter -1, you allow an unlimited number of open connections at the same time.

  • Max waiting time (ms): enter the maximum amount of time the connection pool waits to return a connection when one is requested. The default is -1, that is to say, the wait is unlimited.

  • Min number of idle connections: enter the minimum number of idle connections (connections not used) maintained in the connection pool.

  • Max number of idle connections: enter the maximum number of idle connections (connections not used) maintained in the connection pool.
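
These four parameters use the vocabulary of Apache Commons Pool. Purely as a sketch, not the code Talend Studio generates, the same limits could be expressed with the GenericObjectPoolConfig class of commons-pool2 (version 2.6 or later is assumed for the generic type):

    import org.apache.commons.pool2.impl.GenericObjectPoolConfig;

    public class PoolSizingSketch {
        public static void main(String[] args) {
            GenericObjectPoolConfig<Object> config = new GenericObjectPoolConfig<>();
            config.setMaxTotal(8);         // Max total number of connections; -1 = unlimited
            config.setMaxWaitMillis(-1L);  // Max waiting time (ms); -1 = wait indefinitely
            config.setMinIdle(0);          // Min number of idle connections to maintain
            config.setMaxIdle(8);          // Max number of idle connections to keep
            System.out.println("maxTotal=" + config.getMaxTotal());
        }
    }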

Evict connections

Select this check box to define criteria for destroying connections in the connection pool. The following fields are displayed once you have selected it (an illustrative sketch follows this list).

  • Time between two eviction runs: enter the interval (in milliseconds) between two runs of the eviction check, which destroys the connections that meet the criteria below.

  • Min idle time for a connection to be eligible to eviction: enter the minimum amount of time (in milliseconds) a connection may remain idle before it is destroyed.

  • Soft min idle time for a connection to be eligible to eviction: this parameter works the same way as Min idle time for a connection to be eligible to eviction, except that it always keeps the minimum number of idle connections, the number you define in the Min number of idle connections field.
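
Again as a hedged sketch in commons-pool2 terms (the names below are those of that library, not of the code Talend Studio generates), the three eviction fields correspond to the following settings:

    import org.apache.commons.pool2.impl.GenericObjectPoolConfig;

    public class EvictionSketch {
        public static void main(String[] args) {
            GenericObjectPoolConfig<Object> config = new GenericObjectPoolConfig<>();
            config.setMinIdle(2);                              // idle connections protected by the soft rule
            config.setTimeBetweenEvictionRunsMillis(60_000L);  // Time between two eviction runs
            // Soft min idle time: after 2 min idle, evict only the connections
            // in excess of the minIdle count above.
            config.setSoftMinEvictableIdleTimeMillis(120_000L);
            // Min idle time: after 3 min idle, evict regardless of minIdle.
            config.setMinEvictableIdleTimeMillis(180_000L);
            System.out.println("eviction runs every "
                    + config.getTimeBetweenEvictionRunsMillis() + " ms");
        }
    }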

Usage

Usage rule

This component is used standalone: it does not need to be connected to any other component.

The configuration in a tDynamoDBConfiguration component applies only to the DynamoDB-related components in the same Job. In other words, DynamoDB components used in a child or parent Job called via tRunJob cannot reuse this configuration.

This component, along with the Spark Streaming component Palette it belongs to, appears only when you are creating a Spark Streaming Job.

Note that in this documentation, unless otherwise explicitly stated, a scenario presents only Standard Jobs, that is to say traditional Talend data integration Jobs.