tELTSAPInput Standard properties - 7.3

Version
7.3
Language
English
Product
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Data Integration
Talend Data Management Platform
Talend Data Services Platform
Talend ESB
Talend MDM Platform
Talend Real-Time Big Data Platform
Module
Talend Studio
Last publication date
2024-02-21

These properties are used to configure tELTSAPInput running in the Standard Job framework.

The Standard tELTSAPInput component belongs to the ELT family.

The component in this framework is available in all Talend products.

Basic settings

Schema and Edit schema

A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. When you create a Spark Job, avoid the reserved word "line" when naming the fields.

  • Built-In: You create and store the schema locally for this component only.

  • Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs.

Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available:

  • View schema: choose this option to view the schema only.

  • Change to built-in property: choose this option to change the schema to Built-in for local changes.

  • Update repository connection: choose this option to change the schema stored in the repository and decide whether to propagate the changes to all the Jobs upon completion. If you just want to propagate the changes to the current Job, you can select No upon completion and choose this schema metadata again in the Repository Content window.

Default Table Name

Enter the name of the input table between double quotation marks.
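For example, assuming a hypothetical Job that reads the SAP material master table, you could enter the value below (MARA is used purely as an illustration; use the name of your own SAP table):

  "MARA"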

Mapping
Specify the metadata mapping file for the database to be used. The metadata mapping file is used for the data type conversion between database and Java. For more information about the metadata mapping, see the related documentation for Type mapping.
Note: You can use Hive mapping to support Databricks Delta Lake.

Note that for this component the drop-down list is disabled: Mapping SAP is selected by default and the SAP metadata mapping file is used.

Advanced settings

tStatCatcher Statistics

Select this check box to gather the Job processing metadata at the Job level as well as at each component level.

Global Variables

ERROR_MESSAGE

The error message generated by the component when an error occurs. This is an After variable and it returns a string.
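As with other Talend components, this After variable can be retrieved from the globalMap once the component has finished executing, for example in a tJava component placed after the subJob. A minimal Java sketch, assuming the component instance is named tELTSAPInput_1 (the actual instance name depends on your Job):

  // Retrieve the error message produced by the tELTSAPInput_1 instance, if any
  String errorMessage = (String) globalMap.get("tELTSAPInput_1_ERROR_MESSAGE");
  if (errorMessage != null) {
      System.err.println("tELTSAPInput_1 reported an error: " + errorMessage);
  }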

Usage

Usage rule

The tELTSAPInput component is typically used together with the tELTSAPMap component. The name of the link from tELTSAPInput to tELTSAPMap must be the same as the table name specified in the Default Table Name field of tELTSAPInput, as illustrated in the example below.
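For example, in a hypothetical Job design, a tELTSAPInput_1 component with its Default Table Name field set to "MARA" is connected to a tELTSAPMap_1 component; the link between them must then be renamed MARA so that the mapping and the generated SQL reference the correct table. The component and table names used here are illustrative only:

  tELTSAPInput_1 (Default Table Name: "MARA") --[link renamed to MARA]--> tELTSAPMap_1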

Support

tELTSAPInput should be used to interact with the ERP part of SAP, including SAP S/4HANA.