Merge pull request #107 from JuliaTrustworthyAI/90-average-calibration
90 average calibration
pat-alt authored Sep 3, 2024
2 parents b638fcb + e212365 commit f7bf0f4
Showing 74 changed files with 32,510 additions and 11,589 deletions.
6 changes: 0 additions & 6 deletions .github/workflows/CI.yml
@@ -31,12 +31,6 @@ jobs:
       - os: windows-latest
         version: '1.9'
         arch: x64
-      - os: windows-latest
-        version: '1'
-        arch: x64
-      - os: macOS-latest
-        version: '1'
-        arch: x64
       - os: macOS-latest
         version: '1.9'
         arch: x64
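
The six deleted lines drop the Julia `'1'` (latest stable) builds for Windows and macOS from the test matrix, leaving only the pinned `'1.9'` builds on those platforms. For orientation, here is a hypothetical sketch of where the surviving entries would sit after this commit; the `jobs`/`strategy`/`matrix`/`include` nesting is an assumption based on common Julia CI templates, and only the two leaf entries are taken from the hunk above:

```yaml
# Hypothetical reconstruction, not part of the diff: the test matrix
# after this commit. The strategy/matrix/include nesting is assumed
# from common Julia CI templates; only the two entries themselves
# come from the hunk above.
jobs:
  test:
    strategy:
      matrix:
        include:
          - os: windows-latest
            version: '1.9'
            arch: x64
          - os: macOS-latest
            version: '1.9'
            arch: x64
```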
2 changes: 1 addition & 1 deletion _freeze/docs/src/index/execute-results/md.json
@@ -2,7 +2,7 @@
"hash": "0b3babc9ba412d09f74672b1ac5c443d",
"result": {
"engine": "jupyter",
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n![](assets/wide_logo.png)\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n# LaplaceRedux\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🏃 Getting Started\n\nIf you are new to Deep Learning in Julia or simply prefer learning through videos, check out this awesome YouTube [tutorial](https://www.youtube.com/channel/UCQwQVlIkbalDzmMnr-0tRhw) by [doggo.jl](https://www.youtube.com/@doggodotjl/about) 🐶. Additionally, you can also find a [video](https://www.youtube.com/watch?v=oWko8FRj_64) of my presentation at JuliaCon 2022 on YouTube. \n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. 
The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=100)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.prior.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.prior.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n![](assets/wide_logo.png)\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n\n# LaplaceRedux\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🏃 Getting Started\n\nIf you are new to Deep Learning in Julia or simply prefer learning through videos, check out this awesome YouTube [tutorial](https://www.youtube.com/channel/UCQwQVlIkbalDzmMnr-0tRhw) by [doggo.jl](https://www.youtube.com/@doggodotjl/about) 🐶. Additionally, you can also find a [video](https://www.youtube.com/watch?v=oWko8FRj_64) of my presentation at JuliaCon 2022 on YouTube. \n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. 
The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=100)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.prior.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.prior.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
"supporting": [
"index_files"
],
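
One caveat worth flagging about the installation snippets embedded in the frozen docs above: `Pkg.add("LaplaceRedux.jl")` and `Pkg.add("https://...")` are not forms Julia's Pkg accepts, since registered package names carry no `.jl` suffix and repository URLs go through the `url` keyword. A minimal working equivalent of what the page intends, reusing the name and URL from the page itself:

```julia
using Pkg

# Stable release from the General registry; the registered name is
# "LaplaceRedux", without the ".jl" suffix.
Pkg.add("LaplaceRedux")

# Development version straight from the repository; Pkg.add expects
# the URL via the `url` keyword argument.
Pkg.add(url="https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl")
```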
… (diffs for the remaining 72 files not shown)
