Uncertainty Quantification Analysis: {{ model_name }}

This report analyzes how well the model quantifies uncertainty in its predictions, as expressed through prediction intervals and predicted probabilities.

Model: {{ model_type }}

Date: {{ timestamp }}

Uncertainty Score

{% if uncertainty_score is defined and uncertainty_score is not none %}{{ (uncertainty_score * 100)|round|int }}%{% else %}N/A{% endif %}

{% if uncertainty_score is not defined or uncertainty_score is none %}No uncertainty score available{% elif uncertainty_score >= 0.8 %}Excellent uncertainty quantification{% elif uncertainty_score >= 0.6 %}Good uncertainty quantification{% elif uncertainty_score >= 0.4 %}Moderate uncertainty quantification{% else %}Needs improvement in uncertainty quantification{% endif %}

Performance Metrics

Coverage Score: {% if coverage_score is defined and coverage_score is not none %}{{ (coverage_score * 100)|round(2) }}%{% else %}N/A{% endif %}
Calibration Error: {% if calibration_error is defined and calibration_error is not none %}{{ (calibration_error * 100)|round(2) }}%{% else %}N/A{% endif %}
Sharpness (average interval width): {% if sharpness is defined and sharpness is not none %}{{ sharpness|round(3) }}{% else %}N/A{% endif %}
Consistency: {% if consistency is defined and consistency is not none %}{{ (consistency * 100)|round(2) }}%{% else %}N/A{% endif %}

Executive Summary

This section provides a high-level overview of the model's uncertainty quantification performance.

Key Findings

  • Overall uncertainty score: {% if uncertainty_score is defined and uncertainty_score is not none %}{{ (uncertainty_score * 100)|round|int }}%{% else %}N/A{% endif %}
  • Prediction intervals have {% if coverage_score is defined and coverage_score is not none %}{{ (coverage_score * 100)|round(1) }}%{% else %}N/A{% endif %} coverage (target: {{ target_coverage|default('90') }}%)
  • Calibration error: {% if calibration_error is defined and calibration_error is not none %}{{ (calibration_error * 100)|round(2) }}%{% else %}N/A{% endif %}
  • Average interval width: {% if sharpness is defined and sharpness is not none %}{{ sharpness|round(3) }}{% else %}N/A{% endif %}

Recommendations

{% if uncertainty_score is not defined or uncertainty_score is none %}
  • No uncertainty score available, so no recommendations can be provided
{% elif uncertainty_score >= 0.8 %}
  • Model uncertainty estimates are well-calibrated and reliable
  • Consider using this model in production with confidence
{% elif uncertainty_score >= 0.6 %}
  • Model uncertainty estimates are generally reliable
  • Consider recalibration to improve them further
{% elif uncertainty_score >= 0.4 %}
  • Model uncertainty estimates need improvement
  • Consider recalibration or using a different uncertainty quantification method
{% else %}
  • Model uncertainty estimates are not reliable
  • Do not use these uncertainty estimates for decision making
  • Consider retraining the model with a better uncertainty quantification method
{% endif %}

Coverage Analysis

This section shows how well the model's prediction intervals cover the true values.
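
For reference, the coverage score corresponds to the empirical coverage of the prediction intervals: the fraction of held-out true values that fall inside their predicted interval. The sketch below is a minimal illustration of that calculation; the array names y_true, lower, and upper are placeholders, not the names used by the reporting pipeline.

    import numpy as np

    def empirical_coverage(y_true, lower, upper):
        """Fraction of true values falling inside [lower, upper]."""
        y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
        return float(np.mean((y_true >= lower) & (y_true <= upper)))

    # Example: a 90% interval (alpha = 0.1) should cover roughly 90% of true values.
    y_true = np.array([3.1, 2.4, 5.0, 4.2])
    lower = np.array([2.5, 2.0, 4.8, 3.0])
    upper = np.array([3.5, 3.0, 5.5, 4.0])
    print(empirical_coverage(y_true, lower, upper))  # 0.75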

Coverage Summary

Alpha Level | Expected Coverage | Actual Coverage | Interval Width | Coverage Error

Uncertainty Distribution

This section shows the distribution of uncertainty across predictions.
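
A common way to summarize this distribution is through the spread of per-prediction interval widths (or predictive standard deviations). The sketch below shows one possible quartile summary, assuming interval bounds are available as arrays; it is illustrative rather than the exact summary used to generate the figures in this report.

    import numpy as np

    def width_distribution(lower, upper):
        """Quartile summary of prediction-interval widths."""
        widths = np.asarray(upper) - np.asarray(lower)
        return {
            "mean": float(widths.mean()),
            "p25": float(np.percentile(widths, 25)),
            "median": float(np.median(widths)),
            "p75": float(np.percentile(widths, 75)),
        }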

Alpha Level Details

Detailed analysis of model performance at each significance level (alpha); an interval constructed at level alpha has an expected coverage of (1 - alpha) × 100%.
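
As a worked example of that relationship, an alpha of 0.05 corresponds to 95% expected coverage and an alpha of 0.10 to 90%. The coverage error reported for each level can then be read as the gap between achieved and nominal coverage; the helper below is a hypothetical illustration of one such definition (an absolute gap), not necessarily the exact formula used here.

    def coverage_error(actual_coverage, alpha):
        """Absolute gap between achieved and nominal coverage (illustrative)."""
        expected = 1.0 - alpha
        return abs(actual_coverage - expected)

    print(coverage_error(0.87, 0.10))  # ~0.03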

Detailed Metrics

Comprehensive metrics for uncertainty evaluation.

Metric | Value | Description

Calibration Analysis

Analysis of how closely the model's predicted probabilities match the observed frequencies of the corresponding outcomes.
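
For probabilistic (classification-style) predictions, calibration is commonly summarized with an expected calibration error: predictions are grouped into confidence bins, and the gap between average confidence and observed accuracy in each bin is averaged, weighted by bin size. The sketch below is a generic version of that estimator, assuming per-prediction confidences and correctness indicators are available; it is not necessarily the exact estimator behind the calibration error reported earlier.

    import numpy as np

    def expected_calibration_error(confidences, correct, n_bins=10):
        """Bin-size-weighted average |confidence - accuracy| across bins."""
        confidences = np.asarray(confidences, dtype=float)
        correct = np.asarray(correct, dtype=float)
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(edges[:-1], edges[1:]):
            in_bin = (confidences > lo) & (confidences <= hi)
            if in_bin.any():
                gap = abs(confidences[in_bin].mean() - correct[in_bin].mean())
                ece += in_bin.mean() * gap
        return float(ece)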

Calibration Metrics

Metric | Value | Description