Write the sample data to Azure Data Lake Storage - 7.3
Azure Data Lake Store
Version: Cloud, 7.3
Products: Talend Big Data, Talend Big Data Platform, Talend Data Fabric, Talend Real-Time Big Data Platform
Module: Talend Studio
Procedure
- Double-click the tFixedFlowInput component to open its Component view.
- Click the [...] button next to Edit schema to open the schema editor.
- Click the [+] button to add the schema columns shown in the image below.
[Image: schema columns defined for the sample data]
- Click OK to validate these changes and accept the propagation prompted by the pop-up dialog box.
- In the Mode area, select the Use Inline Content radio button and paste the previously mentioned sample data into the Content field that is displayed.
- In the Field separator field, enter a semicolon (;).
- Double-click the tFileOutputParquet component to open its Component view.
- Select the Define a storage configuration component check box and then select the tAzureFSConfiguration component you configured in the previous steps.
- Click Sync columns to ensure that tFileOutputParquet has the same schema as tFixedFlowInput.
- In the Folder/File field, enter the name of the Data Lake Storage folder to be used to store the sample data.
- From the Action drop-down list, select Create if the folder to be used does not exist yet on Azure Data Lake Storage; if this folder already exists, select Overwrite. (A rough code equivalent of the completed configuration is sketched below.)
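For orientation, here is a minimal PySpark sketch of what this configuration amounts to: a small set of inline, semicolon-separated rows written out as Parquet files to a folder on Azure Data Lake Storage. This is not the code Talend Studio generates for the Job; the column names, sample rows, and the abfss:// URI are hypothetical placeholders, and in the actual Job the file-system connection and credentials come from the tAzureFSConfiguration component rather than from a hard-coded URI.

```python
# Illustrative sketch only -- NOT the code Talend Studio generates for this Job.
# Column names, sample rows, and the ADLS URI are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("write_sample_to_adls").getOrCreate()

# tFixedFlowInput with "Use Inline Content" and ";" as the field separator:
# a fixed set of rows defined directly in the Job.
inline_content = [
    "1;Alice;engineering",   # hypothetical sample rows
    "2;Bob;marketing",
]
rows = [line.split(";") for line in inline_content]
df = spark.createDataFrame(rows, ["id", "name", "department"])  # hypothetical schema columns

# tFileOutputParquet backed by tAzureFSConfiguration: write the rows as Parquet
# to a folder on Azure Data Lake Storage. mode("overwrite") mirrors the Overwrite
# action; mode("errorifexists") would be closer to Create.
df.write.mode("overwrite").parquet(
    "abfss://container@account.dfs.core.windows.net/sample_data"  # hypothetical folder
)
```

Running such a sketch directly would also assume a Spark session configured with the Azure Hadoop connector and valid credentials, which is exactly the role tAzureFSConfiguration plays inside the Talend Job.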