Mean Absolute Percentage Error (MAPE) Metric¶
The mape metric calculates the Mean Absolute Percentage Error, measuring the accuracy of predicted values against observed values as a percentage. MAPE provides a scale-independent measure of prediction accuracy that is easy to interpret.
Metric Type: mape
MAPE Calculation¶
The MAPE is calculated as: mean(|observed - predicted| / |observed|) * 100
Where:
- observed = Observed values (actual values)
- predicted = Predicted values
- The result is expressed as a percentage
MAPE is a relative measure that expresses accuracy as a percentage, making it useful for comparing models across different scales. A value of 0% indicates perfect prediction accuracy.
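As an illustration of the formula above (a plain-Python sketch, not the library's internal implementation; the function name `mape` is chosen here for clarity):

```python
def mape(observed, predicted):
    """Mean Absolute Percentage Error, expressed as a percentage.

    Assumes every observed value is non-zero (see Data Requirements below).
    """
    errors = [abs(o - p) / abs(o) for o, p in zip(observed, predicted)]
    return sum(errors) / len(errors) * 100

# Perfect predictions give 0%; a uniform 10% relative error gives a MAPE of ~10%.
perfect = mape([100, 200], [100, 200])
uniform = mape([100, 200], [110, 220])
```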
Configuration Fields¶
Record-Level Data Format¶
For individual observation records:
```yaml
collections:
  model_mape:
    metrics:
      - name:
          - prediction_accuracy
        data_format: record
        observed: observed_values
        predicted: predicted_values
        segment:
          - - model_version
        metric_type: mape
        dataset: predictions
```
Summary-Level Data Format¶
For pre-aggregated percentage error data:
```yaml
collections:
  summary_mape:
    metrics:
      - name:
          - aggregated_mape
        data_format: summary
        volume: observation_count
        sum_absolute_percentage_errors: sape
        segment:
          - - data_source
        metric_type: mape
        dataset: error_summary
```
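For summary-level data, the metric is recovered from the two pre-aggregated columns. A plausible reading (an assumption; this page does not spell out whether the stored sum is already percentage-scaled) is a simple ratio, sketched here with the illustrative name `summary_mape`:

```python
def summary_mape(volume, sum_abs_pct_errors):
    """Recover MAPE for one group from pre-aggregated values.

    Assumption (not stated on this page): sum_absolute_percentage_errors
    already stores percentage-scaled errors, so no extra *100 is applied.
    """
    return sum_abs_pct_errors / volume

# Four observations whose percentage errors sum to 40 give a MAPE of 10%.
group_mape = summary_mape(4, 40.0)
```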
Required Fields by Format¶
Record-Level Required¶
- name: Metric name(s)
- data_format: Must be "record"
- observed: Observed values column name
- predicted: Predicted values column name
- dataset: Dataset reference
Summary-Level Required¶
- name: Metric name(s)
- data_format: Must be "summary"
- volume: Volume count column name
- sum_absolute_percentage_errors: Sum of absolute percentage errors column name
- dataset: Dataset reference
Optional Fields¶
segment: List of column names for grouping
Output Columns¶
The metric produces the following output columns:
- group_key: Segmentation group identifier (struct of segment values)
- volume: Total number of observations
- mape: Mean Absolute Percentage Error value (as a percentage)
Fan-out Examples¶
Single Configuration¶
```yaml
collections:
  basic_mape:
    metrics:
      - name:
          - model_mape
        data_format: record
        observed: actual_values
        predicted: predicted_values
        metric_type: mape
        dataset: validation_data
```
Segmented Analysis¶
```yaml
collections:
  segmented_mape:
    metrics:
      - name:
          - regional_mape
          - product_mape
        data_format: record
        observed: observed_values
        predicted: predicted_values
        segment:
          - - region
          - - product_type
        metric_type: mape
        dataset: performance_data
```
Mixed Data Formats¶
```yaml
collections:
  detailed_mape:
    metrics:
      - name:
          - record_mape
        data_format: record
        observed: actual
        predicted: predicted
        metric_type: mape
        dataset: detailed_data
  summary_mape:
    metrics:
      - name:
          - summary_mape
        data_format: summary
        volume: count
        sum_absolute_percentage_errors: sape
        metric_type: mape
        dataset: summary_data
```
Data Requirements¶
Record-Level Data¶
- One row per observation
- Observed column: numeric values that are non-zero; values at or very close to zero make the percentage error undefined or arbitrarily large
- Predicted column: numeric values (any numeric value is allowed)
Summary-Level Data¶
- One row per group/segment
- Volume counts: positive numbers
- Sum of absolute percentage errors: non-negative numbers (zero only for perfect predictions)
MAPE Interpretation¶
Value Guidelines¶
- 0%: Perfect prediction accuracy
- < 10%: Highly accurate predictions
- 10-20%: Good prediction accuracy
- 20-50%: Reasonable prediction accuracy
- > 50%: Poor prediction accuracy
Scale Independence¶
- MAPE is scale-independent, making it useful for comparing predictions across different data ranges
- Results are expressed as percentages, making them intuitive to interpret
- Can be used to compare model performance across different datasets
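The scale-independence claim above can be checked directly: identical relative errors at very different magnitudes yield the same MAPE. A small self-contained sketch (the helper `mape` is defined inline for illustration):

```python
def mape(observed, predicted):
    """Illustrative MAPE helper: mean(|o - p| / |o|) * 100."""
    errs = [abs(o - p) / abs(o) for o, p in zip(observed, predicted)]
    return sum(errs) / len(errs) * 100

# Values near 1 and values near a million, each with a 10% relative error,
# produce (numerically) the same MAPE, so cross-dataset comparison is fair.
small_scale = mape([1.0, 2.0], [1.1, 2.2])
large_scale = mape([1e6, 2e6], [1.1e6, 2.2e6])
assert abs(small_scale - large_scale) < 1e-6
```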
Important Notes¶
- Zero Division: MAPE cannot be calculated when observed values are zero or very close to zero
- Asymmetric: MAPE treats over- and under-predictions differently: for non-negative predictions, an under-prediction contributes at most 100% error, while an over-prediction's percentage error can grow without bound
- Scale Independence: Unlike RMSE, MAPE is not affected by the scale of the data
- Percentage Format: Results are expressed as percentages and can exceed 100%
- Data Quality: Remove observations where observed values are zero or missing before calculation
- Outlier Sensitivity: MAPE can be sensitive to outliers, especially when observed values are small
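The data-quality note above (drop zero or near-zero observed values before computing) can be sketched as a small pre-filtering step. The function name `safe_mape` and the `eps` threshold are illustrative choices, not library defaults:

```python
def safe_mape(observed, predicted, eps=1e-9):
    """MAPE computed only over rows whose observed value is usable.

    Rows where |observed| <= eps are dropped, per the data-quality
    guidance above; eps is an illustrative threshold, not a library default.
    """
    pairs = [(o, p) for o, p in zip(observed, predicted) if abs(o) > eps]
    if not pairs:
        raise ValueError("no rows with non-zero observed values")
    return sum(abs(o - p) / abs(o) for o, p in pairs) / len(pairs) * 100

# The zero-observed row is excluded instead of raising a division error,
# so the result is computed over the single remaining valid row.
result = safe_mape([0.0, 100.0], [5.0, 110.0])
```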