This report analyzes how well the model quantifies uncertainty in its predictions through prediction intervals and predicted probabilities.
Model: {{ model_type }}
Date: {{ timestamp }}
Uncertainty Score: {% if uncertainty_score is defined and uncertainty_score is not none %} {{ (uncertainty_score * 100)|round|int }}% {% else %} N/A {% endif %}
{% if uncertainty_score is not defined or uncertainty_score is none %} No uncertainty score available {% elif uncertainty_score >= 0.8 %} Excellent uncertainty quantification {% elif uncertainty_score >= 0.6 %} Good uncertainty quantification {% elif uncertainty_score >= 0.4 %} Moderate uncertainty quantification {% else %} Needs improvement in uncertainty quantification {% endif %}
This section provides a high-level overview of the model's uncertainty quantification performance.
This section shows how well the model's prediction intervals cover the true values.
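As an illustration only, the quantities in the coverage table below could be computed along the lines of the following sketch; the function and array names are hypothetical, assuming NumPy arrays of true values and precomputed interval bounds rather than this report's actual implementation.

```python
# Minimal sketch: empirical coverage, mean interval width, and coverage error
# for a single alpha level. Names are hypothetical.
import numpy as np

def interval_coverage_stats(y_true, lower, upper, alpha):
    """Compare actual coverage of [lower, upper] intervals with the 1 - alpha target."""
    y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
    covered = (y_true >= lower) & (y_true <= upper)       # hit/miss per prediction
    actual_coverage = float(covered.mean())               # fraction of true values inside
    expected_coverage = 1.0 - alpha                       # nominal coverage for this alpha
    return {
        "alpha": alpha,
        "expected_coverage": expected_coverage,
        "actual_coverage": actual_coverage,
        "interval_width": float(np.mean(upper - lower)),  # average interval width
        "coverage_error": abs(actual_coverage - expected_coverage),
    }
```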
Alpha Level | Expected Coverage | Actual Coverage | Interval Width | Coverage Error |
---|---|---|---|---|
This section shows the distribution of uncertainty across predictions.
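One simple way to look at this distribution is to treat the per-prediction interval width as an uncertainty proxy and summarize it, as in the sketch below (hypothetical names, NumPy assumed; not necessarily the summary statistics used in this report).

```python
# Minimal sketch: distribution of per-prediction uncertainty, using interval
# width as the uncertainty proxy. Names are hypothetical.
import numpy as np

def uncertainty_distribution(lower, upper):
    widths = np.asarray(upper) - np.asarray(lower)        # per-prediction uncertainty proxy
    return {
        "mean_width": float(widths.mean()),
        "median_width": float(np.median(widths)),
        "p10_width": float(np.percentile(widths, 10)),    # narrowest 10% of intervals
        "p90_width": float(np.percentile(widths, 90)),    # widest 10% of intervals
        "width_std": float(widths.std()),
    }
```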
This section provides a detailed analysis of model performance at different confidence levels (alpha).
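As a sketch only: assuming Gaussian predictive distributions (a mean and standard deviation per prediction), intervals at several alpha levels could be derived and compared against their nominal coverage as follows. The Gaussian assumption and all names here are illustrative, not necessarily how this report's intervals are produced.

```python
# Minimal sketch: sweep over several alpha levels, derive two-sided intervals
# from assumed Gaussian predictive distributions, and check coverage.
import numpy as np
from scipy.stats import norm

def per_alpha_coverage(y_true, pred_mean, pred_std, alphas=(0.01, 0.05, 0.10, 0.20)):
    y_true = np.asarray(y_true)
    rows = []
    for alpha in alphas:
        z = norm.ppf(1.0 - alpha / 2.0)                   # two-sided critical value
        lower = np.asarray(pred_mean) - z * np.asarray(pred_std)
        upper = np.asarray(pred_mean) + z * np.asarray(pred_std)
        covered = float(np.mean((y_true >= lower) & (y_true <= upper)))
        rows.append({
            "alpha": alpha,
            "expected_coverage": 1.0 - alpha,
            "actual_coverage": covered,
            "coverage_error": abs(covered - (1.0 - alpha)),
        })
    return rows
```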
This section presents comprehensive metrics for uncertainty evaluation.
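One widely used metric of this kind is the mean interval (Winkler) score, which penalizes both wide intervals and missed true values. The sketch below is an illustration with hypothetical names, not necessarily the exact metric set listed in the table that follows.

```python
# Minimal sketch: mean interval (Winkler) score for two-sided prediction intervals.
# Lower is better; misses are penalized in proportion to 2 / alpha.
import numpy as np

def mean_interval_score(y_true, lower, upper, alpha):
    y, l, u = map(np.asarray, (y_true, lower, upper))
    width = u - l                                          # base cost: interval width
    below = (l - y) * (y < l)                              # penalty when truth falls below the interval
    above = (y - u) * (y > u)                              # penalty when truth falls above the interval
    return float(np.mean(width + (2.0 / alpha) * (below + above)))
```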
Metric | Value | Description |
---|---|---|
This section analyzes the model's probability calibration.
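A common calibration summary is the expected calibration error (ECE): predicted probabilities are grouped into bins, and the average confidence in each bin is compared with the observed frequency of positives. The sketch below (hypothetical names, binary case, equal-width bins) illustrates the idea and is not necessarily the calibration metric reported in the table that follows.

```python
# Minimal sketch: expected calibration error (ECE) with equal-width bins
# for binary predicted probabilities.
import numpy as np

def expected_calibration_error(y_true, y_prob, n_bins=10):
    y_true, y_prob = np.asarray(y_true), np.asarray(y_prob)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for i, (lo, hi) in enumerate(zip(bins[:-1], bins[1:])):
        upper_ok = y_prob <= hi if i == n_bins - 1 else y_prob < hi  # include 1.0 in last bin
        in_bin = (y_prob >= lo) & upper_ok
        if not in_bin.any():
            continue
        conf = y_prob[in_bin].mean()                       # average predicted probability in bin
        acc = y_true[in_bin].mean()                        # observed frequency of positives in bin
        ece += in_bin.mean() * abs(acc - conf)             # bin weight times calibration gap
    return float(ece)
```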
Metric | Value | Description |
---|---|---|