Turing.jl is a Julia library for general-purpose probabilistic programming. Turing allows the user to write models using standard Julia syntax, and provides a wide range of sampling-based inference methods for solving problems across probabilistic machine learning, Bayesian statistics, and data science. Compared to other probabilistic programming languages, Turing has a special focus on modularity, decoupling the modelling language (i.e. the compiler) from the inference methods. This modular design, together with the use of the high-level numerical language Julia, makes Turing particularly extensible: new model families and inference methods can be easily added.
Current features include:
- General-purpose probabilistic programming with an intuitive modelling interface
- Robust, efficient Hamiltonian Monte Carlo (HMC) sampling for differentiable posterior distributions
- Particle MCMC sampling for complex posterior distributions involving discrete variables and stochastic control flows
- Compositional inference via Gibbs sampling that combines particle MCMC, HMC, Random-Walk MH (RWMH) and Elliptical Slice Sampling
- Advanced variational inference based on ADVI and Normalising Flows
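As an illustrative sketch of the modelling interface (the model name, priors, and data here are invented for this example, not taken from the Turing documentation), a simple coin-flip model with HMC-based sampling might look like:

```julia
using Turing

# A hypothetical coin-flip model: p is the unknown bias of the coin,
# y is a vector of observed flips (1 = heads, 0 = tails).
@model function coinflip(y)
    p ~ Beta(1, 1)            # prior on the coin's bias
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)   # likelihood of each observed flip
    end
end

# Draw posterior samples with the No-U-Turn Sampler (NUTS),
# one of Turing's HMC-based methods.
chain = sample(coinflip([1, 1, 0, 1]), NUTS(), 1_000)
```

Compositional inference works along the same lines: a `Gibbs` sampler can, for instance, combine an HMC move on continuous parameters with a particle-Gibbs move on discrete ones (e.g. `Gibbs(HMC(0.05, 10, :p), PG(10, :z))`, where `:p` and `:z` stand for continuous and discrete model variables respectively).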
Turing's home page, with links to everything you'll need to use Turing, is:
https://turing.ml/dev/docs/using-turing/get-started
See the Releases page for release notes and version history.
Turing was originally created and is now managed by Hong Ge. Current and past Turing team members include Hong Ge, Kai Xu, Martin Trapp, Mohamed Tarek, Cameron Pfiffer, and Tor Fjelde. You can see the full list of contributors on GitHub: https://github.com/TuringLang/Turing.jl/graphs/contributors.
Turing is an open-source project, so if you feel you have some relevant skills and are interested in contributing, please do get in touch. See the Contributing page for details on the process. You can contribute by opening issues on GitHub or by implementing things yourself and making a pull request. We would also appreciate example models written using Turing.
Issues related to bugs and feature requests are welcome on the issues page, while discussions and questions about statistical applications and theory should take place on the Discussions page or in our #turing channel on the Julia Slack. If you do not already have an invitation to Julia's Slack, you can get one here.
- The Stan language for probabilistic programming - Stan.jl
- Bare-bones implementation of robust dynamic Hamiltonian Monte Carlo methods - DynamicHMC.jl
- Comparing performance and results of MCMC options using Julia - MCMCBenchmarks.jl
If you use Turing for your own research, please consider citing the following publication: Hong Ge, Kai Xu, and Zoubin Ghahramani: Turing: A Language for Flexible Probabilistic Inference. AISTATS 2018. (pdf, bibtex)