
Commit 8aae7b6

Vianney Taquet committed
Fix equation in tutorial_regression
1 parent 2459abf commit 8aae7b6

File tree

1 file changed (+20 additions, -17 deletions)


examples/regression/4-tutorials/plot_main-tutorial-regression.py

Lines changed: 20 additions & 17 deletions
@@ -2,29 +2,31 @@
 ===============================
 Tutorial for tabular regression
 ===============================
+"""

-In this tutorial, we compare the prediction intervals estimated by MAPIE on a
-simple, one-dimensional, ground truth function :math:`f(x) = x \times sin(x)`.
+##############################################################################
+# In this tutorial, we compare the prediction intervals estimated by MAPIE on a
+# simple, one-dimensional, ground truth function
+# :math:`f(x) = x \times \sin(x)`.

-Throughout this tutorial, we will answer the following questions:
+# Throughout this tutorial, we will answer the following questions:

-- How well do the MAPIE strategies capture the aleatoric uncertainty
-  existing in the data?
+# - How well do the MAPIE strategies capture the aleatoric uncertainty
+#   existing in the data?

-- How do the prediction intervals estimated by the resampling strategies
-  evolve for new *out-of-distribution* data ?
+# - How do the prediction intervals estimated by the resampling strategies
+#   evolve for new *out-of-distribution* data ?

-- How do the prediction intervals vary between regressor models ?
+# - How do the prediction intervals vary between regressor models ?

-Throughout this tutorial, we estimate the prediction intervals first using
-a polynomial function, and then using a boosting model, and a simple neural
-network.
+# Throughout this tutorial, we estimate the prediction intervals first using
+# a polynomial function, and then using a boosting model, and a simple neural
+# network.

-**For practical problems, we advise using the faster CV+ or
-Jackknife+-after-Bootstrap strategies.
-For conservative prediction interval estimates, you can alternatively
-use the CV-minmax strategies.**
-"""
+# **For practical problems, we advise using the faster CV+ or
+# Jackknife+-after-Bootstrap strategies.
+# For conservative prediction interval estimates, you can alternatively
+# use the CV-minmax strategies.**

 import os
 import subprocess
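
The changed text above recommends the CV+, Jackknife+-after-Bootstrap, and CV-minmax strategies. For context, here is a minimal sketch of how those strategies are selected, assuming MAPIE's v0.x API (MapieRegressor and mapie.subsample.Subsample); the LinearRegression estimator, toy data, and n_resamplings value are illustrative assumptions, not part of this commit:

# Minimal sketch, assuming MAPIE's v0.x API; estimator, data, and
# n_resamplings are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression
from mapie.regression import MapieRegressor
from mapie.subsample import Subsample

rng = np.random.default_rng(59)
X = rng.uniform(0, 10, size=(500, 1))
# Noisy samples of the tutorial's ground truth f(x) = x * sin(x).
y = (X * np.sin(X)).ravel() + rng.normal(0, 0.5, size=500)

strategies = {
    # CV+: "plus" conformity scores aggregated across CV folds.
    "cv_plus": MapieRegressor(LinearRegression(), method="plus", cv=5),
    # Jackknife+-after-Bootstrap: "plus" aggregation over bootstrap resamples.
    "jackknife_plus_ab": MapieRegressor(
        LinearRegression(), method="plus", cv=Subsample(n_resamplings=50)
    ),
    # CV-minmax: wider, more conservative intervals from fold-wise extremes.
    "cv_minmax": MapieRegressor(LinearRegression(), method="minmax", cv=5),
}

X_new = np.linspace(0, 10, 100).reshape(-1, 1)
for name, mapie in strategies.items():
    mapie.fit(X, y)
    y_pred, y_pis = mapie.predict(X_new, alpha=0.05)  # 95% intervals
    print(name, y_pis.shape)  # (100, 2, 1): lower/upper bound per alpha

Exact class and argument names may differ in other MAPIE versions.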
@@ -477,7 +479,7 @@ def get_heteroscedastic_coverage(y_test, y_pis, STRATEGIES, bins):
 )

 # fig = plt.figure()
-heteroscedastic_coverage.T.plot.bar(figsize=(12, 4), alpha=0.7)
+heteroscedastic_coverage.T.plot.bar(figsize=(12, 5), alpha=0.7)
 plt.axhline(0.95, ls="--", color="k")
 plt.ylabel("Conditional coverage")
 plt.xlabel("x bins")
@@ -785,6 +787,7 @@ def mlp():
         ax=ax,
         title=name
     )
+plt.show()


 fig, ax = plt.subplots(1, 1, figsize=(7, 5))
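
Aside from the \sin(x) fix, the first hunk moves the intro prose out of the module docstring into "#" comment blocks and the last hunk adds plt.show(); both follow the sphinx-gallery layout, in which only the title stays in the docstring and text cells are introduced by a separator line of "#" characters. A minimal sketch of that layout (title and text here are placeholders, not from the tutorial):

"""
===========
Short title
===========

"""
##############################################################################
# Prose in a "#" block after a separator line of "#" characters is rendered
# by sphinx-gallery as a text cell in the built gallery page.

import matplotlib.pyplot as plt  # code between text blocks runs as a cell

##############################################################################
# A second text cell; ending a plotting cell with plt.show() (as the last
# hunk adds) makes the figure appear beneath that cell when rendered.

plt.plot([0, 1], [0, 1])
plt.show()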
