import pointblank as pb
import polars as pl
tbl = pl.DataFrame(
    {
        "a": [5, 7, 1, 3, 9, 4],
        "b": [6, 3, 0, 5, 8, 2],
        "c": [10, 4, 8, 9, 10, 5],
    }
)
pb.preview(tbl)
Validate.conjointly
Validate.conjointly(
    *exprs,
    pre=None,
    thresholds=None,
    actions=None,
    brief=None,
    active=True,
)
Perform multiple row-wise validations for joint validity.
The conjointly() validation method checks whether each row in the table passes multiple validation conditions simultaneously. This enables compound validation logic where a test unit (typically a row) must satisfy all specified conditions to pass the validation.
This method accepts multiple validation expressions as callables, which should return boolean expressions when applied to the data. You can use lambdas that incorporate Polars/Pandas/Ibis expressions (based on the target table type) or create more complex validation functions. The validation operates over a number of test units equal to the number of rows in the table (determined after any pre= mutation has been applied).
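As a plain-Python sketch (an illustration of the conjoint semantics described above, not pointblank's implementation), a row passes only when every condition holds for it at once:

```python
# Each row is one test unit; it passes only if ALL conditions hold for it.
rows = [
    {"a": 5, "b": 6, "c": 10},
    {"a": 7, "b": 3, "c": 4},
    {"a": 1, "b": 0, "c": 8},
    {"a": 3, "b": 5, "c": 9},
    {"a": 9, "b": 8, "c": 10},
    {"a": 4, "b": 2, "c": 5},
]
conditions = [
    lambda r: r["a"] > 2,
    lambda r: r["b"] < 7,
    lambda r: r["a"] + r["b"] < r["c"],
]
passed = [all(cond(r) for cond in conditions) for r in rows]
# Only {"a": 3, "b": 5, "c": 9} satisfies all three conditions jointly.
```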
Parameters
*exprs : Callable = ()
    Multiple validation expressions provided as callable functions. Each callable should accept a table as its single argument and return a boolean expression or Series/Column that evaluates to boolean values for each row.
pre : Callable | None = None
    An optional preprocessing function or lambda to apply to the data table during interrogation. This function should take a table as input and return a modified table. Have a look at the Preprocessing section for more information on how to use this argument.
thresholds : int | float | bool | tuple | dict | Thresholds = None
    Set threshold failure levels for reporting and reacting to exceedances of the levels. Thresholds set here at the step level will override any global thresholds set in Validate(thresholds=...). The default is None, which means that no thresholds will be set locally and global thresholds (if any) will take effect. Look at the Thresholds section for information on how to set threshold levels.
actions : Actions | None = None
    Optional actions to take when the validation step meets or exceeds any set threshold levels. If provided, the Actions class should be used to define the actions.
brief : str | bool | None = None
    An optional brief description of the validation step that will be displayed in the reporting table. You can use templating elements like "{step}" to insert the step number, or "{auto}" to include an automatically generated brief. If True, the entire brief will be automatically generated. If None (the default), there won’t be a brief.
active : bool = True
    A boolean value indicating whether the validation step should be active. Using False will make the validation step inactive (still reporting its presence and keeping indexes for the steps unchanged).
Returns
Validate
    The Validate object with the added validation step.
Preprocessing
The pre= argument allows for a preprocessing function or lambda to be applied to the data table during interrogation. This function can be any callable that takes a table as input and returns a modified table, which is useful for performing any necessary transformations or filtering on the data before the validation step is applied. For example, you could use a lambda function to filter the table based on certain criteria or to apply a transformation to the data. Regarding the lifetime of the transformed table, it only exists during the validation step and is not stored in the Validate object or used in subsequent validation steps.
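As a rough plain-Python sketch of that lifetime rule (not pointblank internals), the filtered table determines the test-unit count for the step, while the original table is left untouched:

```python
# pre= narrows the table first, so the step's test units equal the
# filtered row count; `rows` stays unchanged for any later steps.
rows = [{"c": 10}, {"c": 4}, {"c": 8}, {"c": 9}, {"c": 10}, {"c": 5}]

pre = lambda rs: [r for r in rs if r["c"] > 5]  # hypothetical pre= callable

filtered = pre(rows)          # exists only for this validation step
n_test_units = len(filtered)  # 4 of the 6 rows have c > 5
```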
Thresholds
The thresholds= parameter is used to set the failure-condition levels for the validation step. If they are set here at the step level, these thresholds will override any thresholds set at the global level in Validate(thresholds=...).
There are three threshold levels: ‘warning’, ‘error’, and ‘critical’. A threshold value can be set either as a proportion of all test units that fail (a value between 0 and 1) or as an absolute number of failing test units (an integer of 1 or greater).
Thresholds can be defined using one of these input schemes:
- use the Thresholds class (the most direct way to create thresholds)
- provide a tuple of 1-3 values, where position 0 is the ‘warning’ level, position 1 is the ‘error’ level, and position 2 is the ‘critical’ level
- create a dictionary of 1-3 value entries; the valid keys are ‘warning’, ‘error’, and ‘critical’
- a single integer/float value denoting the absolute number or fraction of failing test units for the ‘warning’ level only
If the number of failing test units exceeds a set threshold, the validation step will be marked as ‘warning’, ‘error’, or ‘critical’. Not all threshold levels need to be set; you’re free to set any combination of them.
Aside from reporting failure conditions, thresholds can be used to determine the actions to take for each level of failure (using the actions= parameter).
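The input schemes and exceedance rules above can be sketched in plain Python (an assumption about the documented semantics, not pointblank's internal code):

```python
LEVELS = ("warning", "error", "critical")

def normalize_thresholds(spec):
    # Map the accepted input schemes onto a {level: value} dict.
    if isinstance(spec, (int, float)) and not isinstance(spec, bool):
        return {"warning": spec}          # single value sets 'warning' only
    if isinstance(spec, tuple):
        return dict(zip(LEVELS, spec))    # positions: warning, error, critical
    if isinstance(spec, dict):
        bad = set(spec) - set(LEVELS)
        if bad:
            raise ValueError(f"invalid threshold keys: {bad}")
        return dict(spec)
    raise TypeError("unsupported thresholds input")

def exceeded(n_failing, n_units, value):
    # A float in [0, 1] is a failing proportion; an int >= 1 is a count.
    if isinstance(value, float) and value <= 1:
        return n_failing / n_units >= value
    return n_failing >= value

thresholds = normalize_thresholds((1, 0.2, 0.5))
# With 2 failing out of 10 test units:
triggered = {lvl: exceeded(2, 10, v) for lvl, v in thresholds.items()}
```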
Examples
For the examples here, we’ll use a simple Polars DataFrame with three numeric columns (a, b, and c): the table created and previewed at the top of this page.
Let’s validate that the values in each row satisfy multiple conditions simultaneously:
- Column a should be greater than 2
- Column b should be less than 7
- The sum of a and b should be less than the value in column c
We’ll use conjointly() to check all these conditions together:
validation = (
    pb.Validate(data=tbl)
    .conjointly(
        lambda df: pl.col("a") > 2,
        lambda df: pl.col("b") < 7,
        lambda df: pl.col("a") + pl.col("b") < pl.col("c")
    )
    .interrogate()
)

validation
The validation table shows that not all rows satisfy all three conditions together. For a row to pass the conjoint validation, all three conditions must be true for that row.
We can also use preprocessing to filter the data before applying the conjoint validation:
validation = (
    pb.Validate(data=tbl)
    .conjointly(
        lambda df: pl.col("a") > 2,
        lambda df: pl.col("b") < 7,
        lambda df: pl.col("a") + pl.col("b") < pl.col("c"),
        pre=lambda df: df.filter(pl.col("c") > 5)
    )
    .interrogate()
)

validation
This allows for more complex validation scenarios where the data is first prepared and then validated against multiple conditions simultaneously.
Or, you can use the backend-agnostic column expression helper expr_col() to write expressions that work across different table backends:
tbl = pl.DataFrame(
    {
        "a": [5, 7, 1, 3, 9, 4],
        "b": [6, 3, 0, 5, 8, 2],
        "c": [10, 4, 8, 9, 10, 5],
    }
)

# Using backend-agnostic syntax with expr_col()
validation = (
    pb.Validate(data=tbl)
    .conjointly(
        lambda df: pb.expr_col("a") > 2,
        lambda df: pb.expr_col("b") < 7,
        lambda df: pb.expr_col("a") + pb.expr_col("b") < pb.expr_col("c")
    )
    .interrogate()
)

validation
Using expr_col() allows your validation code to work consistently across Pandas, Polars, and Ibis table backends without changes, making your validation pipelines more portable.
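The idea behind such a backend-agnostic helper can be sketched with a small deferred-expression class (an illustration only, NOT pointblank's actual expr_col() implementation): the expression records what to compute and evaluates lazily against a concrete row.

```python
class ExprCol:
    """Deferred column expression: builds a row -> value computation."""

    def __init__(self, fn):
        self._fn = fn  # row (dict) -> value

    def __add__(self, other):
        return ExprCol(lambda r: self._fn(r) + other._fn(r))

    def __gt__(self, other):
        rhs = other._fn if isinstance(other, ExprCol) else (lambda r: other)
        return lambda r: self._fn(r) > rhs(r)

    def __lt__(self, other):
        rhs = other._fn if isinstance(other, ExprCol) else (lambda r: other)
        return lambda r: self._fn(r) < rhs(r)

def expr_col(name):
    # Deferred reference to a named column; nothing is evaluated yet.
    return ExprCol(lambda r: r[name])

# Building the condition produces a predicate; applying it evaluates it.
cond = expr_col("a") + expr_col("b") < expr_col("c")
cond({"a": 3, "b": 5, "c": 9})   # 3 + 5 < 9  -> True
cond({"a": 5, "b": 6, "c": 10})  # 11 < 10    -> False
```

A real implementation would dispatch each recorded operation to the backend at hand (a Polars expression, a Pandas Series, an Ibis column) instead of a plain dict, which is what makes the same lambda portable.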
See Also
Look at the documentation of the expr_col() function for more information on how to use it with different table backends.