
Using xnd under probabilistic programming libraries #8

Open
saulshanabrook opened this issue Mar 23, 2018 · 1 comment

@saulshanabrook
Member

I have used PyMC3 a few times to do statistical learning tasks (both in NLP and in political science). It has a pretty nice Python API. One of the rough points (from my limited perspective) is that it uses Theano for the math, so sometimes it becomes a bit confusing why things are going slowly and how to make them faster. From their docs on how they use Theano, it seems like it might be possible to replace it with gumath and xnd?

I think it would be interesting to at least look into this possibility once gumath is more functional.

There is a newer probabilistic programming library called Edward, which is built on top of TensorFlow. I am also interested in seeing what that would look like using gumath.

At the very least, investigating these things would give a good sense of what the limits of the xnd stack are and what libraries like Theano and TensorFlow add on top of them (see the sketch below).
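
To make the idea a bit more concrete, here is a rough, untested sketch of the array-plus-kernels layer that Theano currently provides for PyMC3, expressed with xnd containers and gumath kernels. The module path `gumath.functions` and the exact kernels it exposes are assumptions on my part, not a confirmed API.

```python
# Hypothetical sketch -- module/function names are assumptions, not a
# confirmed gumath API.
from xnd import xnd
import gumath.functions as fn   # assumed location of the elementwise kernels

x = xnd([0.1, 0.2, 0.3])        # typed xnd container in place of an ndarray
y = fn.sin(x)                   # generalized ufunc dispatched through gumath
print(y)
```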

@saulshanabrook
Member Author

PyMC4 discussion on what to use: https://discourse.pymc.io/t/pytorch-backend-for-pymc4/

If we don’t want to block ourselves from being able to implement RHMC in the future, I think we should pick a backend where higher-order differentiation is a first-class citizen. From brief googling, none of the listed packages can fully claim this (although PyTorch seems to be moving in that direction). Autograd, which claims exactly that, was mentioned elsewhere, but was said to be slow. Are there any benchmarks?
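
To illustrate what "higher-order differentiation as a first-class citizen" means in practice, here is a minimal Autograd example. The function `f` is just a placeholder, and this sketch makes no claim about Autograd's speed.

```python
import autograd.numpy as np
from autograd import grad

def f(x):
    return np.sin(x) * x**2   # toy scalar function

df  = grad(f)    # first derivative
d2f = grad(df)   # second derivative, by differentiating the derivative
d3f = grad(d2f)  # third derivative -- the kind of nesting RHMC-style methods need

print(df(1.5), d2f(1.5), d3f(1.5))
```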

Google's Tangent seems like the right level of abstraction for PyMC's backend. I haven't used it much, so I don't have any real-world experience; I only found out about it recently and have just tried some examples.
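
For reference, basic Tangent usage looks roughly like this (a small sketch based on its README-style examples, not production code). Because Tangent does source-to-source transformation, the returned derivative is itself plain Python that you can read and debug.

```python
import tangent

def loss(x):
    return x * x + 3.0 * x   # toy objective

# tangent.grad generates a new Python function computing d(loss)/dx
dloss = tangent.grad(loss)
print(dloss(2.0))  # expect 2*2.0 + 3.0 = 7.0
```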
