Creating an analysis on an HDFS file - 6.3

Talend Data Fabric Studio User Guide

Talend Studio enables easy profiling of HDFS files by generating tables on a Hive connection.

To create a column analysis with simple statistics indicators on an HDFS file, proceed as follows:
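
Before walking through the Studio steps, it may help to see what simple statistics indicators measure. The sketch below is an illustrative Python approximation; the function and indicator names are hypothetical, not Talend's implementation:

```python
# Illustrative sketch of "simple statistics" indicators computed per column.
# The indicator names approximate what a column analysis reports; this is
# not Talend's implementation.
def simple_statistics(values):
    non_null = [v for v in values if v is not None]
    counts = {}
    for v in non_null:
        counts[v] = counts.get(v, 0) + 1
    return {
        "row count": len(values),
        "null count": len(values) - len(non_null),
        "blank count": sum(1 for v in non_null
                           if isinstance(v, str) and v.strip() == ""),
        "distinct count": len(counts),
        "unique count": sum(1 for c in counts.values() if c == 1),
    }

# Example: a column with one null, one blank, and a repeated value.
simple_statistics(["a", "a", "b", "", None])
```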

Creating a connection to a Hadoop cluster

Prerequisite(s): You have opened the Profiling perspective of the Studio. You have the proper access permission to the Hadoop distribution and its HDFS.

  1. In the DQ Repository tree view, expand Metadata, right-click Hadoop Cluster and select Create Hadoop Cluster.

    A wizard opens to guide you through the steps to create a connection to the cluster.

  2. Follow the steps in the wizard to create the connection, and select to manually enter the Hadoop configuration information.

    For detailed information about creating connections to Hadoop clusters, see Managing Hadoop metadata.

  3. Click Check Services at the last step of the wizard to verify that the connection is successful, and then click Finish.

    The new Hadoop connection is listed under the Hadoop Cluster node in the DQ Repository tree view.

Creating a connection to Hive

You can create a connection to Hive directly from the connection you defined for the Hadoop distribution. Alternatively, you can create the Hive connection while you create the analysis on the HDFS file, as outlined in Creating a profiling analysis on the HDFS file via a Hive table.

Prerequisite(s): You have opened the Profiling perspective of the Studio. You have created a connection to the Hadoop distribution.

  1. In the DQ Repository tree view, right-click the Hadoop connection to be used and select Create Hive to open a wizard.

  2. Follow the steps in the wizard to create the connection, and click Check at the last step to verify that the connection is successful.

  3. Click Finish.

    The new Hive connection is listed under the Hadoop Cluster and the DB connections nodes in the DQ Repository tree view.

    For detailed information about creating Hive connections, see Centralizing Hive metadata.

Creating a connection to an HDFS file

Prerequisite(s): You have opened the Profiling perspective of the Studio. You have created a connection to the Hadoop distribution.

  1. In the DQ Repository tree view, right-click the Hadoop connection to be used and select Create HDFS.

    A wizard opens to guide you through the steps to use a file schema from HDFS.

  2. Follow the steps in the wizard to create the connection, and click Check at the last step to verify that the connection is successful.

  3. Click Finish.

    The new HDFS connection is listed under the Hadoop connection in the DQ Repository tree view.

    For detailed information about creating HDFS connections, see Centralizing HDFS metadata.

Creating a profiling analysis on the HDFS file via a Hive table

Prerequisite(s): You have opened the Profiling perspective of the Studio. You have created a connection to the Hadoop distribution and the HDFS file.

  1. In the DQ Repository tree view, right-click the HDFS connection to be used and select Create Simple Analysis.

    A dialog box opens listing the HDFS schemas in the connection.

  2. Select the check box of the file you want to profile.

    Wait until Success appears in the Creation status column.

    Note

    The Hive table you will create is based on folders, not on individual files, so do not select files that have different structures.

  3. Click Check Connection to verify the connection status, and then click Next to open a new view in the wizard that lists the schema of the selected file.

  4. Modify the schema if needed.

    If there is a Date column in the schema, make sure to set the date pattern correctly; otherwise you may get null results.
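
The effect of a wrong date pattern can be illustrated with a small sketch. Python's strptime codes stand in here for the date pattern you set in the wizard, and the helper function is hypothetical:

```python
from datetime import datetime

# Hypothetical illustration: a value that does not match the declared date
# pattern cannot be parsed, so the analysis effectively sees null for it.
# Python strptime codes stand in for the wizard's date pattern syntax.
def parse_date(value, pattern):
    try:
        return datetime.strptime(value, pattern)
    except ValueError:
        return None  # the "null" result a mismatched pattern produces

parse_date("2016-12-31", "%Y-%m-%d")   # pattern matches: a date is returned
parse_date("2016-12-31", "%d/%m/%Y")   # wrong pattern: None
```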

  5. Click Next to open a new view in the wizard where you can create a table with the HDFS schema on a Hive connection.

    If needed, enter a new name for the table. Use lower case, as Hive stores table names in lower case.

  6. Either:

    • From the Select one existed Hive Connection list, select the Hive connection on which you want to create the table.

      You must have at least one correctly configured Hive connection before you can create the table; the Select one existed Hive Connection option is disabled if no Hive connection exists yet.

      You can create a Hive connection if you select the Create a new Hive Connection option in this view of the wizard.

    • Select the Create a new Hive Connection option to first create a Hive connection and then create the table on the new connection.

  7. Click Finish.

    The [New Analysis] wizard opens. Be patient, this may take some time.

  8. Set the analysis metadata and click Finish.

    A new analysis on the selected HDFS file is automatically created and opened in the analysis editor. Simple statistics indicators are automatically assigned to the columns.

    The analysis actually applies to the Hive table, but it computes statistics on the data in HDFS by using the Hive external table mechanism. An external table keeps the data in the original file outside of Hive. If the HDFS file you selected for analysis is deleted, the analysis can no longer run.
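
The external-table idea can be sketched in Python: the table is only metadata (a schema plus a file location), and every read goes back to the original file, which is why deleting the file breaks the analysis. The class and method names below are illustrative, not Hive's API:

```python
import csv
import os

# Illustrative model of a Hive external table: only the schema and the file
# location are stored; the data stays in the original file. Names here are
# hypothetical, not Hive's API.
class ExternalTable:
    def __init__(self, location, columns):
        self.location = location  # path to the data file, not a copy of it
        self.columns = columns

    def rows(self):
        # Every read goes back to the original file, so once the file is
        # deleted the table can no longer be queried.
        if not os.path.exists(self.location):
            raise FileNotFoundError(self.location)
        with open(self.location, newline="") as f:
            return [dict(zip(self.columns, row)) for row in csv.reader(f)]
```

Conversely, dropping such a table removes only the metadata; the underlying file is left untouched.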

  9. Click Refresh Data to display the column content.

    You can use the Select Columns tab to modify the columns to be analyzed.

  10. If needed, click Select Indicators to add more indicators or new patterns to the columns.

  11. Run the analysis to display the results in the Analysis Results view in the editor.

    For further information on column analysis, see Column analyses.