In some cases, when you use an SQL business rule with a join clause to analyze database tables that contain duplicate records, the join results show more rows than the analyzed table itself.
You can generate a ready-to-use analysis of these duplicate records. The results of this analysis help you understand why the join returns more records than the table contains.
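Why a join can return more rows than the analyzed table can be sketched with a minimal, hypothetical example (table and column names are invented for illustration): when the joined table holds duplicate keys, each matching row multiplies the output.

```python
import sqlite3

# Hypothetical tables: 'customers' contains a duplicate record for key 10,
# so joining 'orders' to it yields more rows than 'orders' has.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
cur.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, 20)])
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(10, "Ann"), (10, "Ann"), (20, "Bob")])

rows_in_table = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
rows_in_join = cur.execute(
    "SELECT COUNT(*) FROM orders o JOIN customers c "
    "ON o.customer_id = c.customer_id"
).fetchone()[0]
print(rows_in_table, rows_in_join)  # 2 rows in the table, 3 in the join
```

Order 1 matches both duplicate customer rows, so the join produces three rows from a two-row table.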
Before you begin
A table analysis that uses an SQL business rule with a join condition has been defined and executed in the Profiling perspective of Talend Studio. The join results must show that there are duplicates in the table.
For more information, see Creating a table analysis with an SQL business rule with a join condition.
- After creating and executing an analysis on a table that has duplicate records, as outlined in Creating a table analysis with an SQL business rule with a join condition, click the Analysis Results tab at the bottom of the analysis editor.
- Right-click the join results in the second table and select the option that analyzes the duplicate records.
The Column Selection dialog box opens with the analyzed tables selected by default.
- Modify the selection in the dialog box if needed and then click OK.
- Save the analysis and press F6 to execute it.
The analysis results show two bars, one representing the row count of the data records in the analyzed column and the other representing the duplicate count.
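The two bars can be reproduced with a short sketch in plain Python, assuming (as a working definition) that the duplicate count is the number of rows whose value occurs more than once in the column; the data values here are hypothetical.

```python
from collections import Counter

# Hypothetical values from the analyzed column.
values = ["A", "B", "B", "C", "C", "C"]

# Row count: every record in the analyzed column.
row_count = len(values)
# Duplicate count (assumed definition): rows whose value appears more than once.
duplicate_count = sum(n for n in Counter(values).values() if n > 1)
print(row_count, duplicate_count)  # 6 rows, 5 duplicate rows
```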
Click the Analysis Results tab at the bottom of the analysis editor to access the detailed result view.
Right-click the row count or duplicate count results in the table, or right-click the result bar in the chart itself, and select one of the following options:
- View rows: opens a view listing all data rows or duplicate rows in the analyzed column.
- View values: opens a view listing the duplicate data values of the analyzed column.
- Identify duplicates: generates a ready-to-use Job that identifies and separates unique and duplicate records in the selected column for subsequent processing. By default, this Job writes all the duplicates to a reject .csv file and the unique values to another, separate file.
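The split the generated Job performs between unique and duplicate records can be sketched in plain Python (the records, the key column name, and the output handling here are hypothetical; the actual Job is built from Talend components, not this code):

```python
import csv
import io
from collections import Counter

# Hypothetical input records, keyed on the analyzed column 'account'.
records = [
    {"account": "A001", "amount": "10"},
    {"account": "A002", "amount": "25"},
    {"account": "A001", "amount": "40"},
]

# Count occurrences of each key, then route records accordingly.
counts = Counter(r["account"] for r in records)
uniques = [r for r in records if counts[r["account"]] == 1]
duplicates = [r for r in records if counts[r["account"]] > 1]

# Duplicates go to a reject CSV, mirroring the Job's default output.
reject = io.StringIO()
writer = csv.DictWriter(reject, fieldnames=["account", "amount"])
writer.writeheader()
writer.writerows(duplicates)
print(len(uniques), len(duplicates))  # 1 unique record, 2 duplicates
```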