# Metrics Helper Functions
The metrics module provides helper functions for calculating statistical metrics through a simple, direct interface intended for interactive use.
All metric implementations are internal; users should only call these helper functions.
## Available Metrics
### Accuracy Metrics
- `default_accuracy()` - Calculate default accuracy for binary classification models
- `ead_accuracy()` - Calculate Exposure at Default (EAD) accuracy
- `hosmer_lemeshow()` - Perform Hosmer-Lemeshow goodness-of-fit test
- `jeffreys_test()` - Perform Jeffreys Bayesian calibration test
- `rmse()` - Calculate Root Mean Squared Error for predicted vs observed values
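As a quick illustration, a minimal sketch of calling one of these helpers follows. Only the function name `rmse()` comes from this page; the argument names and order are assumptions and the real signature may differ.

```python
# Hypothetical sketch: argument names and order are assumed, not documented here.
from metrics import rmse

predicted = [0.12, 0.45, 0.33, 0.80]  # model predictions
observed = [0.10, 0.50, 0.30, 0.75]   # realized values

print(rmse(predicted, observed))  # Root Mean Squared Error
```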
### Discrimination Metrics
- `auc()` - Calculate Area Under the ROC Curve
- `kolmogorov_smirnov()` - Calculate Kolmogorov-Smirnov statistic for discrimination testing
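A similar sketch for the discrimination helpers; the `(labels, scores)` calling convention below is an assumption, not taken from this page.

```python
# Hypothetical sketch: the (labels, scores) calling convention is assumed.
from metrics import auc, kolmogorov_smirnov

labels = [0, 0, 1, 1]          # observed binary outcomes
scores = [0.2, 0.4, 0.6, 0.9]  # model scores

print(auc(labels, scores))                 # Area Under the ROC Curve
print(kolmogorov_smirnov(labels, scores))  # KS discrimination statistic
```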
### Summary Statistics
- `mean()` - Calculate mean values with optional segmentation
- `median()` - Calculate median values with optional segmentation
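And for the summary statistics; the `segment_by` keyword below is purely illustrative, inferred from the "optional segmentation" wording rather than from a documented signature.

```python
# Hypothetical sketch: the `segment_by` keyword is assumed, not documented here.
from metrics import mean, median

values = [0.1, 0.4, 0.2, 0.9]
segments = ["retail", "retail", "corporate", "corporate"]

print(mean(values))                       # overall mean
print(mean(values, segment_by=segments))  # per-segment means (assumed keyword)
print(median(values))                     # overall median
```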
## Function Reference
### metrics
Metrics package - Public helper functions for statistical calculations.
This package provides the main public interface for calculating statistical metrics. All metric classes are internal implementations and should not be used directly.
Example usage
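A minimal sketch of interactive use, assuming the helpers are importable directly from the package; only the function names and the `describe()` parameters documented below come from this reference, while the `auc()` argument convention and the printed shapes are illustrative.

```python
from metrics import auc, describe, list_metrics

# Discover the available metric types and the data formats each supports.
print(list_metrics())  # e.g. {"auc": ["record", "summary"], ...} (shape assumed)

# Inspect a metric's inputs and outputs before calling it.
print(describe("auc", data_format="record"))

# Call a helper directly (argument names assumed).
print(auc([0, 1, 1, 0], [0.2, 0.8, 0.7, 0.4]))
```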
#### describe
Describe a metric's inputs and outputs.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `metric_type` | `str` | The metric type name (e.g., `"auc"`). | *required* |
| `data_format` | `str \| None` | Optional format filter (`"record"` or `"summary"`). | `None` |
Returns:
| Type | Description |
|---|---|
| `dict[str, Any]` | Dict with `metric_type`, `data_formats`, and per-format metadata. |
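A short sketch of `describe()` with the parameters documented above; the keys `metric_type` and `data_formats` come from the return description, while the example values are illustrative.

```python
from metrics import describe

# Full metadata for the "auc" metric across all supported data formats.
info = describe("auc")
print(info["metric_type"])   # "auc"
print(info["data_formats"])  # e.g. ["record", "summary"] (illustrative)

# Restrict the output to a single data format.
record_info = describe("auc", data_format="record")
```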
#### list_metrics
Return metric types mapped to supported data formats.
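For example, following the description above (the returned shape and the specific metric names are illustrative):

```python
from metrics import list_metrics

# Assumed shape: metric types mapped to supported data formats,
# e.g. {"auc": ["record", "summary"], "rmse": ["record"], ...}
for metric_type, data_formats in list_metrics().items():
    print(metric_type, data_formats)
```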