tELTHiveOutput properties - 6.3

Talend Components Reference Guide

EnrichVersion: 6.3
EnrichProdName: Talend Big Data, Talend Big Data Platform, Talend Data Fabric, Talend Data Integration, Talend Data Management Platform, Talend Data Services Platform, Talend ESB, Talend MDM Platform, Talend Open Studio for Big Data, Talend Open Studio for Data Integration, Talend Open Studio for Data Quality, Talend Open Studio for ESB, Talend Open Studio for MDM, Talend Real-Time Big Data Platform
task: Data Governance, Data Quality and Preparation, Design and Development
EnrichPlatform: Talend Studio

Component family

ELT/Map/Hive

 

Basic settings

Action on data

Select the action to be performed on the data to be written in the Hive table.

With the Insert option, the data to be written is appended to any data already present in the Hive table.
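For illustration only, the Insert action corresponds to an append-style Hive QL statement of roughly the following shape; the table and column names here are hypothetical and the actual statement is generated by the Studio at run time.

  -- Hypothetical push-down for the Insert action: new rows are
  -- appended and the rows already stored in sales_out are kept.
  INSERT INTO TABLE sales_out
  SELECT id, amount, sale_date
  FROM sales_in;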

 

Schema

A schema is a row description. It defines the number of fields to be processed and passed on to the next component. The schema is either Built-in or stored remotely in the Repository.

 

 

Built-In: You create and store the schema locally for this component only. Related topic: see Talend Studio User Guide.

Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs. Related topic: see Talend Studio User Guide.

Since version 5.6, both the Built-In mode and the Repository mode are available in any of the Talend solutions.

 

Edit schema

Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available:

  • View schema: choose this option to view the schema only.

  • Change to built-in property: choose this option to change the schema to Built-in for local changes.

  • Update repository connection: choose this option to change the schema stored in the repository and decide whether to propagate the changes to all the Jobs upon completion. If you just want to propagate the changes to the current Job, you can select No upon completion and choose this schema metadata again in the [Repository Content] window.

 

Default table name

Enter the default name of the output table you want to write data in.

 

Default schema name

Enter the name of the default database schema to which the output table to be used is related.

 

Use different table name

Select this check box to define a different output table name, between double quotation marks, in the Table name field that appears.

If this table is related to a different database schema from the default one, you also need to enter the name of that database schema. The syntax is schema_name.table_name.
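For example (with hypothetical names), entering "human_resources.employees_out" in the Table name field directs the write to the employees_out table of the human_resources database schema instead of the default one; in Hive QL terms, the fully qualified table name is used:

  -- Hypothetical example of writing to a table outside the default schema.
  INSERT INTO TABLE human_resources.employees_out
  SELECT emp_id, emp_name
  FROM staging_employees;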

 

The target table uses the Parquet format

If the table in which you need to write data is a Parquet table, select this check box.

Then from the Compression list that appears, select the compression mode you need to use to handle the Parquet file. The default mode is Uncompressed.
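For reference, and assuming hypothetical table and column names, a Parquet target table using Snappy compression might be defined in Hive as follows; the check box and the Compression list only describe such a table to the component, they do not create it.

  -- Hypothetical Parquet target table; SNAPPY is one possible
  -- compression mode, Uncompressed being the default.
  CREATE TABLE sales_out (
    id INT,
    amount DOUBLE,
    sale_date STRING
  )
  STORED AS PARQUET
  TBLPROPERTIES ("parquet.compression"="SNAPPY");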

 

Field Partition

In Partition Column, enter the name, without any quotation marks, of the partition column of the Hive table you want to write data in.

In Partition Value, enter the value you want to use, in single quotation marks, for its corresponding partition column.
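As a sketch with hypothetical names, a Partition Column of sale_date and a Partition Value of '2017-01-01' correspond to a statement of the following shape, which is why the column name takes no quotation marks while the value is single-quoted:

  -- Hypothetical partitioned insert: sale_date is the partition column,
  -- '2017-01-01' the single-quoted partition value.
  INSERT INTO TABLE sales_out PARTITION (sale_date = '2017-01-01')
  SELECT id, amount
  FROM sales_in
  WHERE sale_date = '2017-01-01';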

Advanced settings

tStatCatcher Statistics

Select this check box to collect log data at the component level.

Global Variables

ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared, if the component has this check box.

A Flow variable functions during the execution of a component while an After variable functions after the execution of the component.

To fill in a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.

For further information about variables, see Talend Studio User Guide.

Usage

tELTHiveMap is used along with tELTHiveInput and tELTHiveOutput. Note that the Output link used with these components must correspond strictly to the syntax of the table name.

If the Studio used to connect to a Hive database is operated on Windows, you must manually create a folder called tmp in the root of the disk where this Studio is installed.

Note

The ELT components do not handle actual data flow but only schema information.
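In practice the Job only assembles a query from the schema information and lets Hive move the data itself. Assuming hypothetical table and column names, the kind of statement that a tELTHiveInput / tELTHiveMap / tELTHiveOutput design pushes down to Hive might look like this:

  -- Hypothetical statement pushed down to Hive by the ELT components;
  -- no rows pass through the Job itself.
  INSERT INTO TABLE sales_out
  SELECT s.sale_id, c.customer_name, s.amount
  FROM sales_in s
  JOIN customers c ON s.customer_id = c.customer_id;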