Setting up the Job - 7.3

Azure Data Lake Store

Version
7.3
Language
English
Product
Talend Big Data
Talend Big Data Platform
Talend Data Fabric
Talend Data Integration
Talend Data Management Platform
Talend Data Services Platform
Talend ESB
Talend MDM Platform
Talend Real-Time Big Data Platform
Module
Talend Studio
Content
Data Governance > Third-party systems > Cloud storages > Azure components > Azure Data Lake Storage Gen2 components
Data Quality and Preparation > Third-party systems > Cloud storages > Azure components > Azure Data Lake Storage Gen2 components
Design and Development > Third-party systems > Cloud storages > Azure components > Azure Data Lake Storage Gen2 components
Last publication date
2023-06-12

Procedure

  1. Double-click the tFixedFlowInput component to open the Basic settings view.
    1. Click the […] button next to Edit schema and add four columns in the schema editor: ID, type Integer; Name, type String; Address, type String; and Date, type Date with the date pattern MM-dd-yyyy (matching the sample rows below).
    2. Click Yes when prompted to propagate the changes.
      This propagates the schema to the tAzureADLSGen2Output component.
    3. Select the Use Inline Content (delimited file) option.
    4. Enter the following in the Content field (a sketch showing how these rows map onto the schema is given after this procedure):
      1;Bill's Dive Shop;511 Maple Ave;07-07-2020
      2;Facelift Kitchen and Bath;220 Vine Ave.;04-23-2019
      3;Kermit the Pet Shop;1860 Parkside Ln.;05-29-2020
      4;Nirabi Auto Service;1915 Lewis Ln. Apt 13;06-19-2018
      5;Darcy Frame and Matting Servic;1633 McGovern place;01-07-2020
      6;Gourmet the Frog;788 Tennyson Ave.;01-31-2019
    5. Leave other options as they are.
  2. In the Basic settings view of the tAzureADLSGen2Output component:
    1. Check the schema and make sure it has the same columns as the tFixedFlowInput component.
    2. Select Shared Access Signature from the Authentication method drop-down list.
    3. Enter your account name and the SAS token string in the Account name and SAS token fields. (A sketch showing how these settings combine into the request URL is given after this procedure.)
    4. Click the [...] button to the right of the Filesystem field and select the desired file system.
    5. Enter the desired blob path in the Blobs Path field.
    6. Select CSV from the Format drop-down list.
    7. Leave the other options as they are.
  3. In the Basic settings view of the tAzureADLSGen2Input component:
    1. Click the […] button next to Edit schema and add the same four columns as those for the tFixedFlowInput component in the schema editor.
    2. Click Yes when prompted to propagate the changes.
      This propagates the schema to the tLogRow component.
    3. Configure the connection settings (authentication method, account name, SAS token, file system, blob path, and format) in the same way as for the tAzureADLSGen2Output component.
  4. In the Basic settings view of the tLogRow component:
    1. Check the schema and make sure it has the same columns as the tAzureADLSGen2Input component.
    2. Select the Table (print values in cells of a table) option.
    3. Leave other options as they are.
  5. Save the Job.
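
The following short Java sketch is not part of the Job; it only illustrates how the delimited rows entered in the Content field map onto the ID, Name, Address, and Date columns of the schema, using the same semicolon separator and the MM-dd-yyyy date pattern. The class name and the choice of two sample rows are illustrative only.

  import java.text.SimpleDateFormat;
  import java.util.Date;

  public class FixedFlowContentSketch {
      public static void main(String[] args) throws Exception {
          // Two of the rows from the Content field; the remaining rows follow the same layout.
          String[] rows = {
              "1;Bill's Dive Shop;511 Maple Ave;07-07-2020",
              "2;Facelift Kitchen and Bath;220 Vine Ave.;04-23-2019"
          };
          // Same date pattern as the Date column in the schema editor.
          SimpleDateFormat dateFormat = new SimpleDateFormat("MM-dd-yyyy");
          for (String row : rows) {
              String[] fields = row.split(";");            // the delimiter of the inline content
              int id = Integer.parseInt(fields[0]);        // ID, Integer
              String name = fields[1];                     // Name, String
              String address = fields[2];                  // Address, String
              Date date = dateFormat.parse(fields[3]);     // Date, Date
              System.out.printf("%d | %s | %s | %tF%n", id, name, address, date);
          }
      }
  }

Running the sketch prints one parsed record per line, for example 1 | Bill's Dive Shop | 511 Maple Ave | 2020-07-07.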
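The next sketch, also outside the Job, is a minimal illustration of how the Shared Access Signature settings fit together, assuming requests go to the account's ADLS Gen2 (DFS) endpoint with the SAS token appended as the query string. Every value below is a hypothetical placeholder, not a setting taken from this example.

  public class AdlsGen2SasSketch {
      public static void main(String[] args) {
          String accountName = "mystorageaccount";   // Account name field (placeholder)
          String fileSystem  = "myfilesystem";       // Filesystem field (placeholder)
          String blobPath    = "talend/out";         // Blobs Path field (placeholder)
          String sasToken    = "sv=...&sig=...";     // SAS token field (placeholder; keep it secret)

          // The account name selects the DFS endpoint, the file system and blob path select the
          // location inside it, and the SAS token authorizes the request as a query string.
          String requestUrl = String.format("https://%s.dfs.core.windows.net/%s/%s?%s",
                  accountName, fileSystem, blobPath, sasToken);
          System.out.println(requestUrl);
      }
  }

This is also why the Account name, SAS token, Filesystem, and Blobs Path fields must be filled in for both the tAzureADLSGen2Output and tAzureADLSGen2Input components: together they identify exactly where the Job writes and reads the CSV data.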