Write the sample data to Azure Data Lake Storage - 7.1

Azure Data Lake Store

author
Talend Documentation Team
EnrichVersion
7.1
EnrichProdName
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Real-Time Big Data Platform
task
Data Governance > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
Data Quality and Preparation > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
Design and Development > Third-party systems > Cloud storages > Azure components > Azure Data Lake Store components
EnrichPlatform
Talend Studio

Procedure

  1. Double-click the tFixedFlowInput component to open its Component view.

  2. Click the [...] button next to Edit schema to open the schema editor.
  3. Click the [+] button to add the schema columns of the sample data.

  4. Click OK to validate these changes and accept the propagation prompted by the pop-up dialog box.
  5. In the Mode area, select the Use Inline Content radio button and paste the previously mentioned sample data into the Content field that is displayed.
  6. In the Field separator field, enter a semicolon (;). A rough code sketch of the equivalent inline dataset is given after this procedure.
  7. Double-click the tFileOutputParquet component to open its Component view.

  8. Select the Define a storage configuration component check box and then select the tAzureFSConfiguration component you configured in the previous steps.
  9. Click Sync columns to ensure that tFileOutputParquet has the same schema as tFixedFlowInput.
  10. In the Folder/File field, enter the name of the Data Lake storage folder to be used to store the sample data.
  11. From the Action drop-down list, select Create if the folder to be used does not exist yet on Azure Data Lake Storage; if this folder already exists, select Overwrite. A rough sketch of the equivalent Spark Parquet write is given after this procedure.
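
For reference, the input side of this Job (steps 1 to 6) can be approximated with a few lines of PySpark. The sketch below only mimics what tFixedFlowInput does with inline content and a semicolon field separator; the column names (id, name) and the two rows are hypothetical placeholders, not the tutorial's actual sample data or schema.

    # Minimal PySpark sketch of the tFixedFlowInput setup (steps 1 to 6).
    # The columns and rows below are hypothetical; use the sample data and
    # schema defined in the previous steps of this tutorial instead.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("adls_parquet_sketch").getOrCreate()

    # Inline content as pasted into the Content field, one record per line,
    # with ";" as the field separator.
    inline_content = "1;Alice\n2;Bob"

    rows = [line.split(";") for line in inline_content.splitlines()]
    df = spark.createDataFrame(rows, schema=["id", "name"])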
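The write side (steps 7 to 11) then corresponds roughly to a Parquet write against the Data Lake folder, with the Action setting mapping to the Spark save mode. The account, container, and folder names below are placeholders, and the sketch assumes an Azure Data Lake Storage Gen2 (abfss://) path with credentials already configured on the cluster, which is the role tAzureFSConfiguration plays in the Talend Job; a Gen1 path would use the adl:// scheme instead.

    # Minimal sketch of the tFileOutputParquet side, reusing the df built above.
    # The path is a placeholder; authentication is assumed to be configured
    # separately (what tAzureFSConfiguration provides in the Job).
    adls_folder = "abfss://<container>@<account>.dfs.core.windows.net/sample_parquet"

    # Action = Create roughly maps to mode("errorifexists") (fail if the folder
    # already exists); Action = Overwrite maps to mode("overwrite").
    df.write.mode("overwrite").parquet(adls_folder)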