
Talend Studio features

This section provides a list of core features that are installed by default and a list of optional features that need to be installed using the Feature Manager.

Core features installed by default

Feature Description
Creating a Job The Data Integration Job editor is the workspace where you can design your Jobs.
Working with components A component is a functional element that performs a single data integration operation in a Job. Only some basic Data Integration components are installed by default.
Using contexts and variables This feature allows you to manage Jobs and Routes differently for various execution types, for example, testing and production environments.
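Conceptually, a context group pairs each context variable with a value per environment, so the same Job can run against different targets without being edited. As a minimal sketch only (the environment names "test" and "prod" and the variable names are illustrative, not part of Talend Studio; the Studio stores context groups in the Repository rather than in code like this):

```java
import java.util.Map;

// Sketch of the idea behind a context group: one set of variable names,
// one value per execution environment. Illustrative names only.
public class ContextDemo {
    static final Map<String, Map<String, String>> CONTEXTS = Map.of(
            "test", Map.of("dbHost", "localhost", "dbPort", "5432"),
            "prod", Map.of("dbHost", "db.example.com", "dbPort", "5432"));

    // Look up a variable's value for the selected environment.
    static String resolve(String environment, String variable) {
        return CONTEXTS.get(environment).get(variable);
    }
}
```

Switching the active context at run time then changes every resolved value at once, which is what makes the same Job portable across environments.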
What are routines A routine is a Java class that groups reusable functions. It is generally used to factorize code.
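In practice, a routine is a plain Java class exposing static methods that Jobs can call from their components. As a hedged illustration (the class and method names below are hypothetical, not routines shipped with Talend Studio):

```java
// Hypothetical user routine: a plain Java class with a static helper
// method, following the general shape of a Talend Studio routine.
public class StringNormalizer {

    /**
     * Trims the input and collapses internal whitespace runs to single
     * spaces; returns an empty string when the input is null.
     */
    public static String normalize(String value) {
        if (value == null) {
            return "";
        }
        return value.trim().replaceAll("\\s+", " ");
    }
}
```

Once such a class is saved under Routines in the Repository, its methods become callable from component expressions, which is how a routine factorizes code across Jobs.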
Introducing Talend SQL templates Talend Studio provides a range of SQL templates to simplify the most common data query and update, schema creation and modification, and data access control tasks. It also comprises a SQL editor which allows you to customize or design your own SQL templates to meet less common requirements.
Centralizing metadata The metadata wizard allows you to store reusable information on databases, files, and/or systems in the Repository tree view. The information can be reused later to set the connection parameters of the relevant input or output components and the data schema in a centralized manner.
Creating a project This feature allows you to work on remote projects stored in Git repositories.
Configuring remote execution This feature allows you to deploy and execute your Jobs on a remote JobServer, whether you work on a local project or a remote one, provided you are connected to Talend Administration Center.
Connecting Talend Studio to Talend Cloud This feature allows you to set up a connection to Talend Cloud.
Opening a remote project This feature allows you to set up a connection to Talend Administration Center.
Publishing to Talend Cloud This feature allows you to publish Jobs, Routes, and Data Services (artifacts) created in Talend Studio to Talend Cloud and make them available to specific or all users of Talend Management Console.

Optional features installed using Feature Manager

Category Feature Description
Shared features Talend Project Audit User Guide Talend Project Audit transforms project data flows into valuable business information. It introduces an auditing approach for evaluating various aspects of Jobs implemented in your Talend Studio.
Shared features Building a Job as a Docker image This feature allows you to build a Job as a Docker image in order to execute it on a Docker engine.
Shared features Analyzing repository items This feature provides advanced capabilities for analyzing any given item, such as a Job, in the Repository tree view.
  • Impact Analysis: discovers descendant items up to the target component.
  • Data Lineage: discovers the ancestor items starting with the source component.
Shared features Talend Job Script Reference Guide In addition to the graphical Job design interface, a Job script is another way to create a data integration process with Talend Studio.
Shared features Creating a Job from a template This feature allows you to use predefined templates to create ready-to-run Jobs.
Shared features Metadata Bridge preferences (Talend > Import/Export) Talend Metadata Bridge accelerates the implementation, maintenance, and continuous improvement of integration scenarios by allowing you to synchronize, share, and convert metadata across different components.
Shared features Importing metadata from a CSV file This feature allows you to import metadata from a CSV file exported from an external application.
Shared features Publishing to an artifact repository This feature allows you to publish your Job, Route or Service into an artifact repository.
Shared features Using resources in Jobs This feature allows you to create resources and use them in your Jobs for file handling. This way, when exporting your Jobs, for example, you can pack the resource files as Job dependencies and deploy your Jobs without having to copy the files to the target system.
Shared features Talend Activity Monitoring Console User Guide Talend Activity Monitoring Console is an add-on tool integrated in Talend Studio for monitoring Talend Jobs and projects.
Shared features Testing Jobs and Services using test cases This feature allows you to create test cases to test your Jobs and Services during Continuous Integration development to make sure they will function as expected when they are actually executed to handle large datasets.
Shared features Centralizing a Validation Rule A validation rule is a basic or integrity rule that you can apply to metadata items to check the validity of your data. It can be a basic check for correct values or a referential integrity check, both applicable to database tables or individual columns, file metadata, or any relevant metadata item.
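The two kinds of rule described above can be illustrated with a minimal sketch (not Talend's implementation; the method names and the reference set are hypothetical): a basic check tests a value's format, while a referential integrity check tests membership in a reference set standing in for a lookup table or reference column.

```java
import java.util.Set;

// Illustrative only: the two flavors of validation rule described above.
public class ValidationDemo {
    // Basic check: the value must be a two-letter uppercase country code.
    static boolean isValidCountryCode(String value) {
        return value != null && value.matches("[A-Z]{2}");
    }

    // Referential integrity check: the value must exist in a reference
    // set (standing in for a reference table or column).
    static boolean existsInReference(String value, Set<String> reference) {
        return reference.contains(value);
    }
}
```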
Data Integration > Components Amazon DocumentDB This feature installs Amazon DocumentDB components, including tAmazonDocumentDBConnection, tAmazonDocumentDBInput, tAmazonDocumentDBOutput, and tAmazonDocumentDBClose.
Data Integration > Components CosmosDB SQLAPI This feature installs Azure Cosmos DB SQL API components, including tCosmosDBSQLAPIInput and tCosmosDBSQLAPIOutput.
Data Integration > Components Azure Data Lake Storage Gen2 This feature installs Azure ADLS Gen2 components, including tAzureADLSGen2Input and tAzureADLSGen2Output.
Data Integration > Components Azure Storage This feature installs Azure Storage components, including tAzureStorageQueueCreate, tAzureStorageQueueDelete, tAzureStorageQueueInput, tAzureStorageQueueInputLoop, tAzureStorageQueueList, tAzureStorageQueueOutput, tAzureStorageQueuePurge, tAzureStorageConnection, tAzureStorageContainerCreate, tAzureStorageContainerDelete, tAzureStorageContainerList, tAzureStorageDelete, tAzureStorageGet, tAzureStorageList, tAzureStoragePut, tAzureStorageInputTable, and tAzureStorageOutputTable.
Data Integration > Components BRMS and Rules This feature installs BRMS/rules components, including tBRMS and tRules.
Data Integration > Components Couchbase This feature installs Couchbase components, including tCouchbaseDCInput, tCouchbaseDCOutput, tCouchbaseInput, tCouchbaseOutput.
Data Integration > Components CyberArk This feature installs CyberArk components, including tCyberarkInput.
Data Integration > Components tESBConsumer This feature installs the tESBConsumer component.
Data Integration > Components Google Drive This feature installs Google Drive components, including tGoogleDriveConnection, tGoogleDriveCopy, tGoogleDriveCreate, tGoogleDriveDelete, tGoogleDriveGet, tGoogleDriveList, tGoogleDrivePut.
Data Integration > Components Google Bigtable This feature installs Google Bigtable components, including tBigtableConnection, tBigtableInput, tBigtableOutput, and tBigtableClose.
Data Integration > Components JIRA This feature installs Jira components, including tJIRAInput and tJIRAOutput.
Data Integration > Components Marketo This feature installs Marketo components, including tMarketoBulkExec, tMarketoCampaign, tMarketoConnection, tMarketoInput, tMarketoListOperation, tMarketoOutput.
Data Integration > Components MarkLogic This feature installs MarkLogic components, including tMarkLogicBulkLoad, tMarkLogicClose, tMarkLogicConnection, tMarkLogicInput, tMarkLogicOutput.
Data Integration > Components Neo4j This feature installs Neo4j components, including tNeo4JClose, tNeo4JConnection, tNeo4JInput, tNeo4JOutput, tNeo4JRow, tNeo4JBatchOutput, tNeo4JBatchOutputRelationship, and tNeo4JBatchSchema.
Data Integration > Components NetSuite This feature installs NetSuite components, including tNetsuiteConnection, tNetsuiteInput, tNetsuiteOutput.
Data Integration > Components NoSQL / Big Data This feature installs the NoSQL / Big Data components, including Cassandra, CosmosDB, DBFS, DynamoDB, ELTHive, HBase, HCatalog, HDFS, Hive, Iceberg, Impala, Kafka, MapRDB, MongoDB, Neo4j, SAP HANA, and Sqoop related components.
Data Integration > Components Partitioner This feature installs the Partitioner components, including tCollector, tDepartitioner, tPartitioner, and tRecollector.
Data Integration > Components RabbitMQ This feature installs RabbitMQ components, including tRabbitMQInput, tRabbitMQOutput, tRabbitMQClose, tRabbitMQConnection.
Data Integration > Components tRESTClient This feature installs the tRESTClient component.
Data Integration > Components Salesforce This feature installs Salesforce components, including tSalesforceBulkExec, tSalesforceConnection, tSalesforceEinsteinBulkExec, tSalesforceEinsteinOutputBulkExec, tSalesforceGetDeleted, tSalesforceGetServerTimestamp, tSalesforceGetUpdated, tSalesforceInput, tSalesforceOutput, tSalesforceOutputBulk, tSalesforceOutputBulkExec.
Data Integration > Components SAP HANA (Advanced) This feature installs the advanced SAP HANA components, including tSAPHanaBulkExec and tSAPHanaUnload.
Data Integration > Components Snowflake This feature installs Snowflake components, including tSnowflakeBulkExec, tSnowflakeClose, tSnowflakeCommit, tSnowflakeConnection, tSnowflakeInput, tSnowflakeOutput, tSnowflakeOutputBulk, tSnowflakeOutputBulkExec, tSnowflakeRollback, tSnowflakeRow.
Data Integration > Components Splunk This feature installs Splunk components, including tSplunkEventCollector.
Data Integration > Components Talend Data Preparation The Talend Data Preparation components apply preparations, create datasets in Talend Data Preparation, or create flows with data from Talend Data Preparation datasets.
Data Integration > Components Talend Data Stewardship The Talend Data Stewardship components load data into Talend Data Stewardship campaigns and retrieve or delete data in the form of tasks in Talend Data Stewardship campaigns.
Data Integration > Components Workday This feature installs Workday components, including tWorkdayInput.
Data Integration > Components Zendesk This feature installs Zendesk components, including tZendeskInput and tZendeskOutput.
Data Integration > Metadata Setting up an advanced schema This feature helps you define an Advanced WebService schema and store it in the Repository tree view.
Data Integration > Metadata CDC architectural overview This feature helps you set up a CDC environment on a dedicated database connection so that you can quickly identify and capture data that has been added to, updated in, or removed from database tables, and make this change data available for future use by applications or individuals. It is available for Oracle, MySQL, DB2, PostgreSQL, Sybase, MS SQL Server, Informix, Ingres, Teradata, and AS/400.
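As a conceptual sketch only (not Talend's CDC implementation, which relies on database triggers or redo/transaction logs), change data capture amounts to classifying rows as inserted, updated, or deleted between two states of a table keyed by primary key:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Conceptual CDC sketch: diff two snapshots of a table (primary key ->
// row value) into insert/update/delete change records. Real CDC modes
// read triggers or logs instead of comparing full snapshots.
public class CdcSketch {
    static List<String> diff(Map<Integer, String> before, Map<Integer, String> after) {
        List<String> changes = new ArrayList<>();
        for (Map.Entry<Integer, String> entry : after.entrySet()) {
            String old = before.get(entry.getKey());
            if (old == null) {
                changes.add("INSERT " + entry.getKey());
            } else if (!old.equals(entry.getValue())) {
                changes.add("UPDATE " + entry.getKey());
            }
        }
        for (Integer key : before.keySet()) {
            if (!after.containsKey(key)) {
                changes.add("DELETE " + key);
            }
        }
        return changes;
    }
}
```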
Data Integration > Metadata Centralizing UN/EDIFACT metadata The EDIFACT metadata wizard helps you create a schema to be used for the tExtractEDIField component to read and extract data from UN/EDIFACT message files.
Data Integration > Metadata Centralizing a Hadoop connection This feature enables you to create and store a connection to a Hadoop cluster in the Repository tree view.
Data Integration > Metadata Centralizing HBase metadata This feature enables you to centralize and store the connection information to an HBase database in the Repository tree view.
Data Integration > Metadata Centralizing HCatalog metadata This feature enables you to centralize and store the connection information to an HCatalog table in the Repository tree view.
Data Integration > Metadata Centralizing SAP metadata The SAP metadata wizard helps you create a connection to an SAP BW system and an SAP HANA database and store this connection in the Repository tree view.
Data Integration > Metadata Centralizing MDM metadata (deprecated) The Talend MDM metadata wizard helps you centralize the details of one or more MDM connections in the Repository tree view.
Data Quality Data profiling capabilities and Standard components This feature lets you use:
  • The Profiling perspective: Use predefined or customized patterns and indicators to analyze data stored in different data sources.
  • The Data Explorer perspective: Browse and query the results of the profiling analyses done on data.
  • The Standard Components for Data Quality
Data Quality Spark Batch components and Big Data dependencies This feature lets you use the Spark Batch Components for Data Quality. Big Data features are also installed to make the components work properly.
Data Quality Spark Streaming components and Big Data dependencies This feature lets you use the Spark Streaming Components for Data Quality. Big Data features are also installed to make the components work properly.
Application Integration What is a Service and What is a Route A data service combines data integration with Web services: it enables the graphical design of a Service, which includes a WSDL file and one or more Jobs that address all of the different sources and targets required to publish the Web service. A Route defines how messages are moved from one service (or endpoint) to another.
Big Data Spark Batch This feature enables you to create Spark Batch Jobs.
Big Data Spark Streaming This feature enables you to create Spark Streaming Jobs.
Big Data > Distributions Amazon EMR 5.29.0 This feature enables you to run your Spark Jobs on the Amazon EMR 5.29.0 distribution.
Big Data > Distributions Amazon EMR 6.2.0 This feature enables you to run your Spark Jobs on the Amazon EMR 6.2.0 distribution.
Big Data > Distributions Azure Synapse This feature enables you to run your Spark Jobs on Azure Synapse Analytics with Apache Spark pools as a distribution.
Big Data > Distributions Cloudera CDH Dynamic Distribution This feature enables you to run your Spark Jobs on Cloudera CDH using either Static (CDH 6.1, CDH 6.2 and CDH 6.3) or Dynamic distributions.
Big Data > Distributions Cloudera Data Platform Dynamic Distribution This feature enables you to run your Spark Jobs on Cloudera Data Platform using either Static (CDP 7.1) or Dynamic distributions.
Big Data > Distributions Databricks 5.5 This feature enables you to run your Spark Jobs on the Databricks 5.5 distribution.
Big Data > Distributions Databricks 6.4 This feature enables you to run your Spark Jobs on the Databricks 6.4 distribution.
Big Data > Distributions Databricks 7.3 LTS This feature enables you to run your Spark Jobs on the Databricks 7.3 LTS distribution.
Big Data > Distributions Hortonworks HDP Dynamic Distribution This feature enables you to run your Spark Jobs on Hortonworks HDP using either Static or Dynamic distributions.
Big Data > Distributions Microsoft Azure HDInsight 4.0 This feature enables you to run your Spark Jobs on the Microsoft Azure HDInsight 4.0 distribution.
Big Data > Universal Distribution (Recommended) Universal Distribution (Spark 2.4.x) This feature enables you to run your Spark Jobs on Universal distribution with Spark 2.4.x.
Big Data > Universal Distribution (Recommended) Universal Distribution (Spark 3.0.x) This feature enables you to run your Spark Jobs on Universal distribution with Spark 3.0.x.
Big Data > Universal Distribution (Recommended) Universal Distribution (Spark 3.1.x) This feature enables you to run your Spark Jobs on Universal distribution with Spark 3.1.x.
Big Data > Universal Distribution (Recommended) Universal Distribution (Spark 3.2.x) This feature enables you to run your Spark Jobs on Universal distribution with Spark 3.2.x.
Big Data > Universal Distribution (Recommended) Universal Distribution (Spark 3.3.x) This feature enables you to run your Spark Jobs on Universal distribution with Spark 3.3.x.
Data Mapper Talend Data Mapper User Guide Talend Data Mapper allows you to map complex data records and documents and execute transformations in Data Integration Jobs and Routes.
Data Mapper Talend Data Mapper User Guide Talend Data Mapper for Spark allows you to map complex data records and documents and execute transformations in Big Data Jobs.
Data Mapper > Standard structures HL7 v2.1 This feature allows you to use Talend Data Mapper with data in the HL7 v2.1 standard.
Data Mapper > Standard structures HL7 v2.2 This feature allows you to use Talend Data Mapper with data in the HL7 v2.2 standard.
Data Mapper > Standard structures HL7 v2.3 This feature allows you to use Talend Data Mapper with data in the HL7 v2.3 standard.
Data Mapper > Standard structures HL7 v2.3.1 This feature allows you to use Talend Data Mapper with data in the HL7 v2.3.1 standard.
Data Mapper > Standard structures HL7 v2.4 This feature allows you to use Talend Data Mapper with data in the HL7 v2.4 standard.
Data Mapper > Standard structures HL7 v2.5 This feature allows you to use Talend Data Mapper with data in the HL7 v2.5 standard.
Data Mapper > Standard structures HL7 v2.5.1 This feature allows you to use Talend Data Mapper with data in the HL7 v2.5.1 standard.
Data Mapper > Standard structures HL7 v2.6 This feature allows you to use Talend Data Mapper with data in the HL7 v2.6 standard.
Data Mapper > Standard structures HL7 v2.7 This feature allows you to use Talend Data Mapper with data in the HL7 v2.7 standard.
Data Mapper > Standard structures HL7 v2.7.1 This feature allows you to use Talend Data Mapper with data in the HL7 v2.7.1 standard.
Data Mapper > Standard structures HL7 v2.8 This feature allows you to use Talend Data Mapper with data in the HL7 v2.8 standard.
Data Mapper > Standard structures HL7 v2.8.1 This feature allows you to use Talend Data Mapper with data in the HL7 v2.8.1 standard.
Data Mapper > Standard structures X12 4010 HIPAA This feature allows you to use Talend Data Mapper with data in the X12 4010 HIPAA standard.
Data Mapper > Standard structures X12 5010 HIPAA This feature allows you to use Talend Data Mapper with data in the X12 5010 HIPAA standard.
Master Data Management Master Data Management by Talend Talend MDM addresses the challenges of creating and managing master data for all types of organizations where data is hosted under various formats in various systems. It groups all master data of the company in a central hub and has all the core features a user needs for an MDM application: advanced modeling, model-driven dynamic web interface, full-text search, event triggering, role-based security, etc.
