Validate.rows_distinct

Validate.rows_distinct(
    columns_subset=None,
    pre=None,
    thresholds=None,
    actions=None,
    brief=None,
    active=True,
)

Validate whether rows in the table are distinct.

The rows_distinct() method checks whether rows in the table are distinct. This validation operates over a number of test units equal to the number of rows in the table (determined after any pre= mutation has been applied).

Parameters

columns_subset : str | list[str] | None = None

A single column or a list of columns to use as a subset for the distinct comparison. If None, then all columns in the table will be used for the comparison. If multiple columns are supplied, the distinct comparison will be made over the combination of values in those columns.

pre : Callable | None = None

An optional preprocessing function or lambda to apply to the data table during interrogation. This function should take a table as input and return a modified table. Have a look at the Preprocessing section for more information on how to use this argument.

thresholds : int | float | bool | tuple | dict | Thresholds = None

Set threshold failure levels for reporting and reacting to exceedances of those levels. The thresholds are set at the step level and will override any global thresholds set in Validate(thresholds=...). The default is None, which means that no thresholds will be set locally and global thresholds (if any) will take effect. Look at the Thresholds section for information on how to set threshold levels.

actions : Actions | None = None

Optional actions to take when the validation step meets or exceeds any set threshold levels. If provided, the Actions class should be used to define the actions.

brief : str | bool | None = None

An optional brief description of the validation step that will be displayed in the reporting table. You can use templating elements like "{step}" to insert the step number, or "{auto}" to include an automatically generated brief. If True, the entire brief will be automatically generated. If None (the default), there won’t be a brief.

active : bool = True

A boolean value indicating whether the validation step should be active. Using False will make the validation step inactive (still reporting its presence and keeping indexes for the steps unchanged).

Returns

: Validate

The Validate object with the added validation step.

Preprocessing

The pre= argument allows for a preprocessing function or lambda to be applied to the data table during interrogation. This function should take a table as input and return a modified table. This is useful for performing any necessary transformations or filtering on the data before the validation step is applied.

The preprocessing function can be any callable that takes a table as input and returns a modified table. For example, you could use a lambda function to filter the table based on certain criteria or to apply a transformation to the data. Note that you can refer to columns via columns_subset= that are expected to be present in the transformed table, but may not exist in the table before preprocessing. Regarding the lifetime of the transformed table, it only exists during the validation step and is not stored in the Validate object or used in subsequent validation steps.
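
As a minimal sketch of how pre= might be used with rows_distinct() (the table, column names, and filter below are illustrative, not part of the API), a lambda can narrow the table before the distinctness check runs:

import pointblank as pb
import polars as pl

tbl = pl.DataFrame(
    {
        "col_1": ["a", "b", "b"],
        "group": ["x", "y", "y"],
    }
)

validation = (
    pb.Validate(data=tbl)
    # The callable receives the table and must return a modified table; here
    # only rows in group "y" are kept before checking for duplicate rows
    .rows_distinct(pre=lambda df: df.filter(pl.col("group") == "y"))
    .interrogate()
)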

Thresholds

The thresholds= parameter is used to set the failure-condition levels for the validation step. If they are set here at the step level, these thresholds will override any thresholds set at the global level in Validate(thresholds=...).

There are three threshold levels: ‘warning’, ‘error’, and ‘critical’. The threshold values can be set either as a proportion of all test units that fail (a value between 0 and 1) or as the absolute number of failing test units (an integer that’s 1 or greater).

Thresholds can be defined using one of these input schemes:

  1. use the Thresholds class (the most direct way to create thresholds)
  2. provide a tuple of 1-3 values, where position 0 is the ‘warning’ level, position 1 is the ‘error’ level, and position 2 is the ‘critical’ level
  3. create a dictionary of 1-3 value entries; the valid keys are ‘warning’, ‘error’, and ‘critical’
  4. a single integer/float value denoting absolute number or fraction of failing test units for the ‘warning’ level only

If the number of failing test units exceeds the set thresholds, the validation step will be marked as ‘warning’, ‘error’, or ‘critical’. Not all threshold levels need to be set; you’re free to set any combination of them.

Aside from reporting failure conditions, thresholds can be used to determine the actions to take for each level of failure (using the actions= parameter).
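
As a sketch of the input schemes listed above (the threshold values here are purely illustrative), the same step-level thresholds could be expressed in any of these forms:

import pointblank as pb

# 1. The Thresholds class
thresholds = pb.Thresholds(warning=0.1, error=0.25, critical=0.35)

# 2. A tuple of 1-3 values: (warning, error, critical)
thresholds = (0.1, 0.25, 0.35)

# 3. A dictionary with any of the keys 'warning', 'error', or 'critical'
thresholds = {"warning": 0.1, "critical": 0.35}

# 4. A single value, which sets the 'warning' level only
thresholds = 0.1

# Any of the above can then be passed to a step, e.g. rows_distinct(thresholds=thresholds)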

Examples

For the examples here, we’ll use a simple Polars DataFrame with three string columns (col_1, col_2, and col_3). The table is shown below:

import pointblank as pb
import polars as pl

tbl = pl.DataFrame(
    {
        "col_1": ["a", "b", "c", "d"],
        "col_2": ["a", "a", "c", "d"],
        "col_3": ["a", "a", "d", "e"],
    }
)

pb.preview(tbl)
    col_1   col_2   col_3
    String  String  String
1   a       a       a
2   b       a       a
3   c       c       d
4   d       d       e

Let’s validate that the rows in the table are distinct with rows_distinct(). We’ll determine if this validation had any failing test units (there are four test units, one for each row). A failing test unit means that a given row is a duplicate of at least one other row.

validation = (
    pb.Validate(data=tbl)
    .rows_distinct()
    .interrogate()
)

validation
STEP                COLUMNS      UNITS  PASS      FAIL
1  rows_distinct()  ALL COLUMNS  4      4 (1.00)  0 (0.00)

From this validation table we see that there are no failing test units. All rows in the table are distinct from one another.

We can also use a subset of columns to determine distinctness. Let’s specify the subset using columns col_2 and col_3 for the next validation.

validation = (
    pb.Validate(data=tbl)
    .rows_distinct(columns_subset=["col_2", "col_3"])
    .interrogate()
)

validation
STEP                COLUMNS       UNITS  PASS      FAIL
1  rows_distinct()  col_2, col_3  4      2 (0.50)  2 (0.50)

The validation table reports two failing test units. The first and second rows are duplicated when considering only the values in columns col_2 and col_3. There’s only one set of duplicates, but there are two failing test units since each row is compared to all others.
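
To connect this example with the Thresholds section, the same step could carry step-level thresholds. This is only a sketch with illustrative values: since 2 of the 4 test units fail (a fraction of 0.50), a ‘warning’ threshold of 0.25 would be met while an ‘error’ threshold of 0.75 would not.

validation = (
    pb.Validate(data=tbl)
    .rows_distinct(
        columns_subset=["col_2", "col_3"],
        thresholds=(0.25, 0.75),  # 'warning' at 25% failing, 'error' at 75% failing
    )
    .interrogate()
)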