Using HDFS components to work with Azure Data Lake Store (ADLS)
This scenario describes how to use the HDFS components to read data from and write data to Azure Data Lake Store.
This scenario applies only to a Talend solution with Big Data.
This scenario uses the following components; an illustrative code sketch follows the list.
tFixedFlowInput: it provides the sample data to the Job.
tLibraryLoad: it loads the libraries required by the Job.
tHDFSOutput: it writes the sample data to Azure Data Lake Store.
tHDFSInput: it reads the sample data back from Azure Data Lake Store.
tLogRow: it displays the output of the Job on the console of the Run view of the Job.
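The sketch below is not the code Talend generates; it is a minimal Java example, assuming the connection goes through the standard Hadoop ADLS (Gen1) connector and its OAuth2 client-credential properties (fs.adl.oauth2.*). The account name, application ID, authentication key, tenant ID, and file path are placeholders that you would replace with your own values. It only illustrates the kind of write-then-read flow that tHDFSOutput, tHDFSInput, and tLogRow perform in the Job.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AdlsHdfsSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // OAuth2 client-credential settings of the Hadoop ADLS (Gen1) connector.
        conf.set("fs.adl.oauth2.access.token.provider.type", "ClientCredential");
        conf.set("fs.adl.oauth2.client.id", "your-application-id");        // placeholder
        conf.set("fs.adl.oauth2.credential", "your-authentication-key");   // placeholder
        conf.set("fs.adl.oauth2.refresh.url",
                 "https://login.microsoftonline.com/your-tenant-id/oauth2/token"); // placeholder

        // Placeholder ADLS account and file path.
        URI adls = new URI("adl://yourdatalakestore.azuredatalakestore.net");
        Path sample = new Path("/user/talend/sample.csv");

        try (FileSystem fs = FileSystem.get(adls, conf)) {
            // Write the sample rows, as tHDFSOutput would.
            try (PrintWriter out =
                     new PrintWriter(new OutputStreamWriter(fs.create(sample, true)))) {
                out.println("1;Alice");
                out.println("2;Bob");
            }
            // Read them back, as tHDFSInput would, and print them like tLogRow.
            try (BufferedReader in =
                     new BufferedReader(new InputStreamReader(fs.open(sample)))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}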
Configuring your Azure Data Lake Store
An Azure subscription is required.