I have used PyMC3 a few times to do statistical learning tasks (both in NLP and in political science). It has a pretty nice Python API. One of the rough points (from my limited perspective) is that it uses Theano for the math, so it sometimes becomes confusing why things are going slowly and how to make them faster. From their doc on how they use Theano, it seems like it might be possible to replace it with gumath and xnd?
I think it would be interesting to at least look into this possibility once gumath is more functional.
There is a newer probabilistic programming library called Edward, built on top of TensorFlow. I am also interested in seeing what that would look like using gumath.
At the least, investigating these things would give a good sense of what the limits of the xnd stack are and what libraries like Theano and TensorFlow add on top of them.
If we don’t want to block ourselves from being able to implement RHMC in the future, I think we should pick a backend where higher-order differentiation is a first-class citizen. From brief googling, none of the listed packages can fully claim this (although PyTorch seems to be moving in that direction). Autograd, which claims exactly that, was mentioned elsewhere but was said to be slow. Are there any benchmarks?
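To make the requirement concrete: "higher-order differentiation as a first-class citizen" means the differentiation operator composes with itself, so a second derivative (as RHMC needs for its metric) is just differentiation applied twice. Here is a minimal pure-Python sketch using nested dual numbers (forward-mode autodiff); it is a toy illustration of the property, not how any of the listed packages are implemented.

```python
# Toy forward-mode autodiff with dual numbers. Because Dual arithmetic is
# generic, a Dual can hold another Dual, so nesting `derivative` yields
# exact second derivatives -- differentiation composes with itself.

class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)

    __rmul__ = __mul__

def derivative(f, x):
    """Derivative of f at x; works even when x is itself a Dual."""
    return f(Dual(x, 1.0)).eps

def f(x):
    return x * x * x      # f(x) = x^3

print(derivative(f, 2.0))                               # 3x^2 at 2 -> 12.0
print(derivative(lambda x: derivative(f, x), 2.0))      # 6x   at 2 -> 12.0
```

A backend with this property lets gradients of gradients be requested without special casing; Autograd's `grad` composes the same way in reverse mode, which is what makes it attractive here despite the speed concerns.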
Google's Tangent seems like the right level of abstraction for PyMC's backend. I haven't used it much, so I don't have any real-world experience; I only found out about it recently and have just tried some examples.