Using HDFS components to work with Azure Data Lake Storage (ADLS) - 7.3

HDFS

author
Talend Documentation Team
EnrichVersion
Cloud
7.3
EnrichProdName
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Open Studio for Big Data
Talend Real-Time Big Data Platform
task
Data Governance > Third-party systems > File components (Integration) > HDFS components
Data Quality and Preparation > Third-party systems > File components (Integration) > HDFS components
Design and Development > Third-party systems > File components (Integration) > HDFS components
EnrichPlatform
Talend Studio

This scenario describes how to use the HDFS components to read data from and write data to Azure Data Lake Storage.

For more technologies supported by Talend, see Talend components.

This scenario applies only to Talend products with Big Data.

The Job in this scenario uses the following components:

  • tFixedFlowInput: provides sample data to the Job.

  • tHDFSOutput: writes the sample data to Azure Data Lake Storage.

  • tHDFSInput: reads the sample data back from Azure Data Lake Storage.

  • tLogRow: displays the output of the Job in the console of the Run view.
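Under the hood, the HDFS components can reach Azure Data Lake Storage Gen1 through the adl:// filesystem scheme provided by Hadoop's hadoop-azure-datalake module, authenticating with an Azure AD service principal. As a rough sketch (not the exact configuration Talend Studio generates), the OAuth 2.0 properties involved look like the following; the client ID, secret, and tenant ID values are placeholders you would replace with your own:

```xml
<!-- core-site.xml fragment (sketch): OAuth 2.0 service-principal
     credentials for the adl:// scheme; all values are placeholders -->
<configuration>
  <property>
    <name>fs.adl.oauth2.access.token.provider.type</name>
    <value>ClientCredential</value>
  </property>
  <property>
    <name>fs.adl.oauth2.client.id</name>
    <value>YOUR_APPLICATION_ID</value>
  </property>
  <property>
    <name>fs.adl.oauth2.credential</name>
    <value>YOUR_CLIENT_SECRET</value>
  </property>
  <property>
    <name>fs.adl.oauth2.refresh.url</name>
    <value>https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/token</value>
  </property>
</configuration>
```

With such a configuration in place, files are addressed with URIs of the form adl://YOUR_ACCOUNT.azuredatalakestore.net/path/to/file in the component settings.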