Scaled canonical Gaussian likelihoods #228
Seems at least loosely related to energy-based models. Also, I was thinking of abstracting quadratic forms, and a quick search found this:
Sorry, one correction: if I think about it, this is not a density but a likelihood, so we would like to represent the function class, not the class of measures. That goes in line with what you say!
Great point. @nignatiadis has been thinking about related issues in #226. I really like this idea of having more structured likelihoods, especially for these special cases where we can handle things analytically. Also loosely related is my experimental code for exponential families here. It's currently completely undocumented, lacking even a simple example -- I'll try to add at least a commented-out example soon, and more after my current crunch time (prepping for a talk) has passed.
Sometimes there is a conjugate distribution from a different class of distributions than the prior.
Right. I think this is yet another case where multiple dispatch can help us. Let's look at a simple case: say you have a parameter space. We have a few steps to go through:
Even if the posterior is simple to compute, we still break things up into very small steps. That way, we can easily add methods to any one of these steps. I really think that once we have a nice structure for likelihoods, conjugacy will come very cheap. We should think about whether fusion and pullback are "atomic" or if they might also break apart into smaller steps. As a starting point... What might a type signature look like for fusion, say for a simple example? I guess that discussion should really go into a new issue.
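One rough possibility, as a minimal sketch of how the dispatch could be arranged -- the names `AbstractLikelihood`, `ProductLikelihood`, `ScaledCanonicalGaussian`, and `fuse` are hypothetical placeholders, not existing package API:

```julia
abstract type AbstractLikelihood end

# Generic fallback: represent the fused (pointwise-multiplied) likelihood lazily.
struct ProductLikelihood{A<:AbstractLikelihood,B<:AbstractLikelihood} <: AbstractLikelihood
    a::A
    b::B
end

fuse(a::AbstractLikelihood, b::AbstractLikelihood) = ProductLikelihood(a, b)

# Structured special cases only need to add a method returning a closed form,
# e.g. two canonical Gaussian likelihoods fusing into another one:
# fuse(a::ScaledCanonicalGaussian, b::ScaledCanonicalGaussian) = ...
```

The point of the generic fallback is that structured special cases only need to add a method; conjugacy then falls out wherever a closed form exists.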
We would like to represent the bounded measure (but not necessarily probability measure) with density

$$\exp(c + Fx + x' H x)$$

(so $H$ (= $\Lambda$ in MT parlance) and $F = \Lambda\mu$ could be called the potential parameter).

(For some choice of $c$ this is a probability measure, but the actual value of $c$ itself contains important information about the evidence of a Bayesian model with Gaussian posterior represented in this form.)

The likelihood object should support pairing with Gaussian priors (giving a Gaussian posterior), support fusion #229, and pullback

$$\exp(\tilde c + \tilde F x + x'\tilde H x) = \int \exp(c + F y + y' H y) \, \kappa(x, dy)$$

where $\kappa$ is a linear Gaussian kernel, with density
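To make the representation concrete, a minimal sketch assuming the parametrization above; `ScaledCanonicalGaussian`, `logfun`, and `fuse` are hypothetical names, not existing package API:

```julia
using LinearAlgebra

# log lik(x) = c + F'x + x'Hx, with H playing the role of Λ, F = Λμ the
# "potential parameter", and c kept explicitly because it carries evidence
# information.
struct ScaledCanonicalGaussian{T,V<:AbstractVector{T},M<:AbstractMatrix{T}}
    c::T
    F::V
    H::M
end

# Evaluate the log of the (unnormalized) density at x.
logfun(lik::ScaledCanonicalGaussian, x) = lik.c + dot(lik.F, x) + dot(x, lik.H * x)

# Fusion (#229) is pointwise multiplication, so the parameters simply add:
# exp(c₁ + F₁'x + x'H₁x) ⋅ exp(c₂ + F₂'x + x'H₂x)
#   = exp((c₁ + c₂) + (F₁ + F₂)'x + x'(H₁ + H₂)x)
fuse(a::ScaledCanonicalGaussian, b::ScaledCanonicalGaussian) =
    ScaledCanonicalGaussian(a.c + b.c, a.F + b.F, a.H + b.H)
```

Pairing with a Gaussian prior should then work the same way once the prior is written in this canonical form: the $H$ and $F$ parameters add, and the accumulated $c$ plus the log-normalizer of the resulting quadratic form gives the log evidence.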