tELTOracleOutput Standard properties - Cloud - 8.0

ELT Oracle

Version: Cloud, 8.0
Language: English
Product: Talend Big Data, Talend Big Data Platform, Talend Data Fabric, Talend Data Integration, Talend Data Management Platform, Talend Data Services Platform, Talend ESB, Talend MDM Platform, Talend Real-Time Big Data Platform
Module: Talend Studio
Last publication date: 2024-02-20

These properties are used to configure tELTOracleOutput running in the Standard Job framework.

The Standard tELTOracleOutput component belongs to the ELT family.

The component in this framework is available in all Talend products.

Basic Settings

Action on data

You can perform the following operations on the data of the defined table:

Insert: Adds new entries to the table. If duplicates are found, the Job stops. A sketch of the kind of statement generated for this action follows the note below.

Update: Updates existing entries in the table.

Delete: Deletes the entries that match the input flow.

MERGE: Updates and/or adds data to the table. Note that the options available for the MERGE operation are different from those available for the Insert, Update, or Delete operations.

Note:

The following global variables are available:

  • NB_LINE_INSERTED: Number of lines inserted during the Insert operation.

  • NB_LINE_UPDATED: Number of lines updated during the Update operation.

  • NB_LINE_DELETED: Number of lines deleted during the Delete operation.

  • NB_LINE_MERGED: Number of lines inserted and/or updated during the MERGE operation.
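
Because the ELT components generate SQL statements that run inside the database rather than moving rows through the Job, the Insert action typically resolves to an INSERT ... SELECT statement built from the output of the ELT map. The sketch below is for illustration only; the table and column names are hypothetical.

  -- Illustrative sketch of an Insert action pushed into the database
  INSERT INTO target_customers (id, name, country)
  SELECT s.id, s.name, s.country
  FROM source_customers s
  WHERE s.country = 'FR';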

Schema and Edit schema

A schema is a row description. It defines the number of fields to be processed and passed on to the next component. The schema is either built-in or stored remotely in the Repository.

Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available:

  • View schema: choose this option to view the schema only.

  • Change to built-in property: choose this option to change the schema to Built-in for local changes.

  • Update repository connection: choose this option to change the schema stored in the repository and decide whether to propagate the changes to all the Jobs upon completion.

    If you just want to propagate the changes to the current Job, you can select No upon completion and choose this schema metadata again in the Repository Content window.

 

Built-in: The schema is created and stored locally for this component only. For more information about a component schema in its Basic settings tab, see Basic settings tab.

 

Repository: The schema already exists and is stored in the Repository, hence can be reused. For more information about a component schema in its Basic settings tab, see Basic settings tab.

Where clauses (for UPDATE and DELETE only)

Enter a clause to filter the data to be updated or deleted during the update or delete operations.

This field is available when Update or Delete is selected from the Action on data drop-down list and Use WHERE conditions table is not selected.
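
For illustration only, a clause such as amount > 100 entered in this field restricts the generated statement so that only the matching rows are updated or deleted. Assuming a hypothetical table and column, the resulting update could resemble the following sketch.

  -- Illustrative sketch of an update restricted by a Where clause entry
  UPDATE target_orders
  SET status = 'PROCESSED'
  WHERE amount > 100;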

Use WHERE conditions table

Select this option to display the Where conditions table, where you can set filter conditions by providing values in its Column, Function, and Value fields.

This option is available when Update or Delete is selected from the Action on data drop-down list.

Where conditions table

Add rows and enter conditions as prompted to filter the data to be updated or deleted during the update or delete operation. A data record is selected for the update or delete operation only when it matches all the conditions set in this table.

In addition to the commonly used operators (such as =, >=, <>, and so on), the Function column of this table also provides the following operators:
  • BETWEEN, which filters the data using an interval given in the Value field. The interval needs to be in the format of <start> and <end>, for example: 10 and 20.
  • LIKE, which filters the data using a string given in the Value field. The string needs to be in the format of sample_string%.
  • IN, which filters the data using the elements in a set given in the Value field. The set needs to be in the format of (element1, element2, element3, ...), for example: (1,2,5).

This table is available when the Use WHERE conditions table option is selected.
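
For illustration, all the conditions set in the Where conditions table must be matched, so the generated statement combines them with AND. Assuming hypothetical column names, a Delete action filtered by a BETWEEN, a LIKE, and an IN condition could resemble the following sketch.

  -- Illustrative sketch of a delete filtered by several conditions
  DELETE FROM target_orders
  WHERE amount BETWEEN 10 AND 20
    AND customer_name LIKE 'sample_string%'
    AND region_id IN (1, 2, 5);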

Use Merge Update (for MERGE)

Select this check box to update the data in the output table.

Column: Lists the columns in the input flow.

Update: Select the check box corresponding to each column you want to update.

Use Merge Update Where Clause: Select this check box and enter the WHERE clause used to filter the data to be updated, if necessary.

Use Merge Update Delete Clause: Select this check box and enter the clause used to filter, among the updated rows, the data to be deleted, if necessary.

Use Merge Insert (for MERGE)

Select this check box to insert data into the table.

Column: Lists the columns in the input flow.

Check All: Select the check box corresponding to each column you want to insert.

Use Merge Insert Where Clause: Select this check box and enter the WHERE clause used to filter the data to be inserted.
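
For illustration, the Merge Update and Merge Insert options map onto the clauses of an Oracle MERGE statement. The sketch below uses hypothetical table and column names and indicates where the optional WHERE and DELETE clauses fit.

  -- Illustrative sketch of a MERGE combining update, delete, and insert clauses
  MERGE INTO target_customers t
  USING source_customers s
  ON (t.id = s.id)
  WHEN MATCHED THEN
    UPDATE SET t.name = s.name, t.country = s.country
    WHERE s.country IS NOT NULL            -- Merge Update Where Clause
    DELETE WHERE t.status = 'OBSOLETE'     -- Merge Update Delete Clause
  WHEN NOT MATCHED THEN
    INSERT (t.id, t.name, t.country)
    VALUES (s.id, s.name, s.country)
    WHERE s.country IS NOT NULL;           -- WHERE clause on the insert side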

Default Table Name

Enter a default name for the table, between double quotation marks.

Default Schema Name

Enter a name for the default Oracle schema, between double quotation marks.

Table name from connection name is variable

Select this check box when the name of the connection to this component is set to a variable, such as a context variable.

Use different table name

Select this check box to define a different output table name, between double quotation marks, in the Table name field which appears.

Mapping

Specify the metadata mapping file for the database to be used. The metadata mapping file is used for the data type conversion between database and Java. For more information about the metadata mapping, see the related documentation for Type mapping.
Note: You can use Hive mapping to support Databricks Delta Lake.

Advanced settings

Use Hint Options

Select this check box to activate the hint configuration area when you want to use a hint to optimize the execution of a query. In this area, you can set the following parameters:

  • HINT: specify the hint you need, using the /*+ */ syntax.

  • POSITION: specify where to place the hint in the SQL statement.

  • SQL STMT: select the SQL statement to be used.
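
For illustration, a hint such as /*+ APPEND */ placed immediately after the statement keyword requests a direct-path insert. The table and column names below are hypothetical.

  -- Illustrative sketch of a hint positioned in the generated statement
  INSERT /*+ APPEND */ INTO target_orders (id, amount)
  SELECT s.id, s.amount
  FROM source_orders s;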

tStatCatcher Statistics

Select this check box to gather the Job processing metadata at a Job level as well as at each component level.

Global Variables


NB_LINE: the number of rows read by an input component or transferred to an output component. This is an After variable and it returns an integer.

NB_LINE_INSERTED: the number of rows inserted. This is an After variable and it returns an integer.

ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared, if the component has this check box.

QUERY: the query statement populated by the ELT Map component the component connects to. This is an After variable and it returns a string.
Note: This variable is available only when you have installed the R2022-01 Talend Studio Monthly update or a later one delivered by Talend. For more information, check with your administrator.

A Flow variable functions during the execution of a component while an After variable functions after the execution of the component.

To fill a field or expression with a variable, press Ctrl+Space to access the variable list and choose the variable to use.

For more information about variables, see Using contexts and variables.

Usage

Usage rule

tELTOracleOutput is to be used along with the tELTOracleInput and tELTOracleMap components. Note that the output link used with these components must strictly match the syntax of the table name.

Note:

The ELT components do not handle actual data flow; they handle only schema information.