Presentation of the new sampler

I propose adding the Lie-Trotter sampler from https://proceedings.mlr.press/v162/franzese22a.html to blackjax. It is an analog of the well-known SGHMC sampler, but it rests on a more robust theoretical foundation because it does not rely on the assumption that the noise introduced by mini-batching is Gaussian.
How does it compare to other algorithms in blackjax?
Experiments in the paper suggest that this sampler converges faster because it tolerates larger step sizes. In my own experience, it also makes it possible to skip the window adaptation procedure.
Where does it fit in blackjax?
It would be nice to have this sampler in blackjax because it is more "user-friendly": it is less sensitive to hyperparameters and initialization.
Are you willing to open a PR?
I can implement this sampler in JAX and will try to integrate it into blackjax, but my knowledge of the package's internal structure is superficial, so I may need some help with this.
The Lie-Trotter sampler would be a great addition to the stochastic gradient algorithms in the library. It seems like the Lie-Trotter sampler is exactly SGHMC with a different integrator. If that is correct, the best option is to add a function in sgmcmc/diffusions.py that performs the Lie-Trotter integration, generalize the build_kernel function in sgmcmc/sghmc.py to take the integrator as an input, and then build a lie_trotter class (which will be very similar to the sghmc class).
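To make the proposal concrete, here is a rough, self-contained sketch of what such a diffusion function could look like. This is not blackjax's actual API: the name `lie_trotter`, the `one_step` signature, and the choice of splitting (symplectic Euler for the Hamiltonian part, exact solve of the Ornstein-Uhlenbeck part for friction and noise) are illustrative assumptions, and the gradient is taken full-batch here rather than from a mini-batch as it would be in sgmcmc.

```python
import jax
import jax.numpy as jnp


def lie_trotter(gamma=1.0):
    """Hypothetical Lie-Trotter diffusion for the kinetic Langevin SDE:
        dx = p dt,  dp = grad_log_density(x) dt - gamma * p dt + sqrt(2 gamma) dW.
    The dynamics are split into a Hamiltonian part (integrated with symplectic
    Euler) and an Ornstein-Uhlenbeck part on the momentum (solved exactly)."""

    def one_step(rng_key, position, momentum, logdensity_grad, step_size):
        # Hamiltonian part: symplectic Euler (momentum update, then position).
        momentum = momentum + step_size * logdensity_grad
        position = position + step_size * momentum
        # OU part, solved exactly: p <- e^{-gamma h} p + sqrt(1 - e^{-2 gamma h}) xi,
        # which leaves the unit-variance Gaussian momentum distribution invariant.
        decay = jnp.exp(-gamma * step_size)
        noise = jax.random.normal(rng_key, position.shape)
        momentum = decay * momentum + jnp.sqrt(1.0 - decay**2) * noise
        return position, momentum

    return one_step


# Usage sketch: sample from a standard 1D Gaussian (full-batch gradient).
grad_fn = jax.grad(lambda x: -0.5 * jnp.sum(x**2))
step = lie_trotter(gamma=1.0)


def scan_fn(carry, rng_key):
    position, momentum = carry
    position, momentum = step(rng_key, position, momentum, grad_fn(position), 0.1)
    return (position, momentum), position


keys = jax.random.split(jax.random.PRNGKey(0), 5000)
_, samples = jax.lax.scan(scan_fn, (jnp.zeros(1), jnp.zeros(1)), keys)
```

In an actual PR, `logdensity_grad` would be the mini-batch gradient estimate computed by the sgmcmc machinery, and `one_step` would be the integrator plugged into the generalized `build_kernel`.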