Reading the sample data from Azure Data Lake Storage - 7.1

Azure Data Lake Store

author
Talend Documentation Team
EnrichVersion
7.1
EnrichProdName
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Real-Time Big Data Platform
task
Data Governance > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
Data Quality and Preparation > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
Design and Development > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
EnrichPlatform
Talend Studio

Procedure

  1. Double-click tFileInputParquet to open its Component view.

    Example

    [Image: the tFileInputParquet Component view]

  2. Select the Define a storage configuration component check box and then select the tAzureFSConfiguration component you configured in the previous steps.
  3. Click the [...] button next to Edit schema to open the schema editor.
  4. Click the [+] button to add the schema columns to be used for the output.

    Example

    [Image: the schema columns defined in the schema editor]

  5. Click OK to validate these changes and, in the dialog box that opens, accept the propagation of the schema to the next component.
  6. In the Folder/File field, enter the name of the folder from which you need to read data. In this scenario, it is sample_user.
  7. Double-click tLogRow to open its Component view and select the Table radio button to display the result in a table.
  8. Press F6 to run this Job.
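At run time, tFileInputParquet combines the storage details supplied by tAzureFSConfiguration with the Folder/File value (sample_user in this scenario) into a single file-system URI that Spark reads from. As a minimal illustration, the sketch below builds such a URI, assuming the ADLS Gen1 adl:// scheme and a hypothetical account name; in the actual Job this resolution is handled for you by the configuration component.

```python
# Sketch of the URI resolution performed at run time. The account name
# "my_account" and the adl:// (ADLS Gen1) scheme are assumptions for
# illustration only; in the Job, the storage details come from
# tAzureFSConfiguration and only the Folder/File field is set here.

def adls_uri(account: str, folder: str) -> str:
    """Build the ADLS Gen1 URI Spark uses to locate the Parquet folder."""
    return f"adl://{account}.azuredatalakestore.net/{folder.strip('/')}"

print(adls_uri("my_account", "sample_user"))
# adl://my_account.azuredatalakestore.net/sample_user
```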

Results

Once the Job is done, you can find it on the Jobs page of the web UI of your Databricks cluster, where you can check its execution log.