Commit

👕 Rename tutorial
o-laurent committed Oct 11, 2023
1 parent cffea9c commit c939bbe
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -70,7 +70,7 @@ We provide the following tutorials in our documentation:
 - [From a Vanilla Classifier to a Packed-Ensemble](https://torch-uncertainty.github.io/auto_tutorials/tutorial_pe_cifar10.html)
 - [Training a Bayesian Neural Network in 3 minutes](https://torch-uncertainty.github.io/auto_tutorials/tutorial_bayesian.html)
 - [Improve Top-label Calibration with Temperature Scaling](https://torch-uncertainty.github.io/auto_tutorials/tutorial_scaler.html)
-- [Deep Evidential Regression Tutorial](https://torch-uncertainty.github.io/auto_tutorials/tutorial_der_cubic.html)
+- [Deep Evidential Regression on a Toy Example](https://torch-uncertainty.github.io/auto_tutorials/tutorial_der_cubic.html)

## Awesome Uncertainty repositories

4 changes: 2 additions & 2 deletions auto_tutorials_source/tutorial_der_cubic.py
@@ -2,8 +2,8 @@
 # coding: utf-8

 """
-Deep Evidential Regression Tutorial
-===================================
+Deep Evidential Regression on a Toy Example
+===========================================
 This tutorial aims to provide an introductory overview of Deep Evidential Regression (DER) using a practical example. We demonstrate an application of DER by tackling the toy-problem of fitting :math:`y=x^3` using a Multi-Layer Perceptron (MLP) neural network model. The output layer of the MLP has four outputs, and is trained by minimizing the Normal Inverse-Gamma (NIG) loss function.
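The renamed tutorial's docstring describes an MLP whose four outputs parameterize a Normal Inverse-Gamma distribution, trained by minimizing the NIG negative log-likelihood. As a hedged sketch (not TorchUncertainty's actual implementation), that loss can be written in plain Python following the standard DER formulation; the parameter names `gamma`, `nu`, `alpha`, and `beta` are the conventional NIG symbols, not identifiers from the tutorial itself:

```python
import math

def nig_nll(y, gamma, nu, alpha, beta):
    """Negative log-likelihood of a Normal Inverse-Gamma distribution.

    The four network outputs map to: gamma (predicted mean),
    nu > 0 (virtual observations for the mean), alpha > 1 and
    beta > 0 (Inverse-Gamma shape and scale for the variance).
    """
    omega = 2.0 * beta * (1.0 + nu)
    return (
        0.5 * math.log(math.pi / nu)          # normalization in nu
        - alpha * math.log(omega)             # evidence term
        + (alpha + 0.5) * math.log(nu * (y - gamma) ** 2 + omega)  # data fit
        + math.lgamma(alpha)                  # log Gamma(alpha)
        - math.lgamma(alpha + 0.5)            # minus log Gamma(alpha + 1/2)
    )
```

A sanity check on the shape of this loss: for fixed evidential parameters, the NLL is smallest when the target `y` coincides with the predicted mean `gamma` and grows as the residual grows.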
