Add a single normal form for sum-product contractions #157
Conversation
Looks like the test failure is an unreduced …
As we discussed offline, this looks great. What is your plan for removing the funsors that this PR obsoletes? Do you want to delete that code in this PR, or break the process into two staged PRs?
It's probably a missing pattern for a …
- @dispatch(str, str, Variadic[Gaussian, Joint])
+ @dispatch(str, str, Variadic[(Gaussian, GaussianMixture)])
Is this pattern now too narrow?
It shouldn't be, according to this check in master
I think this might be caused by normalize being a bit too aggressive with pushing down exp, so that we get Cats of terms that look like Contraction[..., Tuple[Unary[Exp, Gaussian], Tensor]].
Can we simply remove Unary[Contraction] simplification from normalize?
Hmm, it might be easier to add normalize patterns that prevent oversimplification, e.g.

@normalize.register(Unary, ExpOp, GaussianMixture)
def normalize_unary_joint(op, arg):
    return None
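The return-None convention above can be illustrated with a self-contained toy rewriter (this is a sketch, not funsor's actual API; all names and the tuple term encoding are hypothetical): a handler that returns None is read as "no rewrite applies", so a narrow pattern returning None vetoes a more general rule.

```python
# Toy sketch of funsor-style rewriting with a return-None veto.
# Terms are nested tuples with strings as leaves; all names are made up.
RULES = []

def register(pred):
    """Register a rewrite rule; later registrations take priority."""
    def decorator(fn):
        RULES.insert(0, (pred, fn))
        return fn
    return decorator

def normalize(term):
    for pred, fn in RULES:
        if pred(term):
            result = fn(term)
            if result is None:
                return term          # None vetoes rewriting this term
            return normalize(result)
    return term

# General rule: push exp inside a two-component mixture (oversimplifies).
@register(lambda t: t[0] == "exp" and t[1][0] == "mix")
def push_exp_inside(term):
    _, (_, a, b) = term
    return ("mix", ("exp", a), ("exp", b))

# Narrow veto, in the spirit of normalize_unary_joint: keep exp outside
# whenever the mixture contains a Gaussian component.
@register(lambda t: t[0] == "exp" and t[1][0] == "mix" and "gaussian" in t[1][1:])
def keep_exp_outside(term):
    return None

# The veto fires for the Gaussian mixture, so the term is left alone:
assert normalize(("exp", ("mix", "gaussian", "tensor"))) == \
    ("exp", ("mix", "gaussian", "tensor"))
# Without a Gaussian component, the general rule still rewrites:
assert normalize(("exp", ("mix", "a", "b"))) == ("mix", ("exp", "a"), ("exp", "b"))
```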
I guess I don't see why we'd want to move unary inside. Unary ops are pointwise and are generally cheaper to apply outside of reductions.
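The cost argument can be made concrete with a toy count of op applications (a numpy illustration, not funsor code; negation is used instead of exp because, unlike exp, it distributes over sum, so both orders agree numerically):

```python
import numpy as np

# Count how many times the pointwise unary op is applied when it sits
# inside vs outside a sum-reduction over a (3, 4) array.
calls = {"n": 0}

def neg(v):
    calls["n"] += 1
    return -v

x = np.arange(12.0).reshape(3, 4)

# Inside the reduction: neg hits all 12 input elements, then we sum.
calls["n"] = 0
inside = np.sum([[neg(v) for v in row] for row in x], axis=0)
inside_calls = calls["n"]

# Outside the reduction: sum first, then neg hits only the 4 outputs.
calls["n"] = 0
outside = np.array([neg(v) for v in x.sum(axis=0)])
outside_calls = calls["n"]

assert np.allclose(inside, outside)
assert (inside_calls, outside_calls) == (12, 4)
```

The results agree, but the "outside" placement touches only the reduced outputs, which is the sense in which pointwise ops are cheaper to apply outside of reductions.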
> I guess I don't see why we'd want to move unary inside
The main use case here (as in optimize and Contract/Integrate already in master) is in rewriting integrals of large joint distributions. Pushing the unary ops down to the leaves makes pattern matching easier, simplifies the normal form, and lets us reuse the optimizer for integrals. In these situations it's unlikely that unary ops would actually be evaluated, since they would eventually be rewritten back into Integrates, KL divergences, etc.
I don't see a reason to change this unless we also want to completely decouple Integrate from Contraction (i.e. never rewrite Integrate to Contraction or vice versa), which I suppose we could discuss separately.
It is also tempting to replace …
> It seems like you really want Reduce(Finitary) rather than this monolithic Contraction and NullOp
Hmm, I disagree. Having a single representation is a design choice that prevents a combinatorial blowup of patterns and of interpreter stack size, and representing sum-product contractions in a single op is important for performance. I also don't expect users to interact with Contraction - they only have to write patterns for Binary and Reduce, or for pairwise contractions.
On Tue, Sep 17, 2019, 6:45 AM Fritz Obermeyer wrote:
> It seems like you really want Reduce(Finitary) rather than this monolithic Contraction and NullOp. I think that would be much clearer to library users and contributors.
> Could we replace Binary with Finitary?
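The performance point about a single variadic sum-product op can be illustrated outside funsor with numpy (a sketch; the arrays and shapes are made up): a chain of pairwise Binary multiplies and Reduce sums computes the same thing as one variadic contraction, but the single-op form hands the whole expression to one routine that is free to choose a good contraction order.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.random((2, 3))
b = rng.random((3, 4))
c = rng.random((4, 2))

# Pairwise form: Binary(mul) then Reduce(sum), one contracted index at a time.
ab = (a[:, :, None] * b[None, :, :]).sum(axis=1)        # contract j: (2, 4)
pairwise = (ab[:, :, None] * c[None, :, :]).sum(axis=1)  # contract k: (2, 2)

# Single variadic contraction over all three factors at once.
single = np.einsum("ij,jk,kl->il", a, b, c)

assert np.allclose(pairwise, single)
```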
Well it works on my bart example now.
Let's try to keep future PRs small and targeted.
Agreed, but I'm not sure replacing …
Let's just merge this and forge ahead.
Resolves #156
As outlined in #156, I'm using this as a staging branch for the refactoring PRs associated with the proposed changes there:

- MultiDelta to keep separate log_densitys for each point (30cbcb1)
- adjoint and einsum
- minipyro
- Number
Original description:
This PR takes a first step towards the goals in #156 by adding two things:

1. A Contraction term representing finitary sum-product operations that takes advantage of multipledispatch.Variadic for pattern matching. This is also necessary for "Implement recognition of affine transforms" (#72).
2. A normalize interpretation that uses associative and distributive properties to put linear expressions into something like DNF, grouping operations into single Contraction terms where possible.

As a demonstration, I have included patterns and tests illustrating how Joint is equivalent to interleaving eager and normalize.
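The DNF-like normal form mentioned above can be sketched in a few lines (a toy illustration, not funsor's normalize; the tuple encoding and function name are hypothetical): distributivity pushes products over sums until the expression is a flat sum of products.

```python
# Toy DNF normalizer: terms are nested tuples ("+", a, b) / ("*", a, b)
# with strings as leaves. The result is a list of products, each product
# a list of leaf factors, i.e. a sum-of-products normal form.
def dnf(term):
    if isinstance(term, str):
        return [[term]]                      # one product of one factor
    op, lhs, rhs = term
    left, right = dnf(lhs), dnf(rhs)
    if op == "+":
        return left + right                  # associativity: merge the sums
    # op == "*": distribute every left product over every right product
    return [p + q for p in left for q in right]

# (a + b) * c  ->  a*c + b*c
assert dnf(("*", ("+", "a", "b"), "c")) == [["a", "c"], ["b", "c"]]
```

In the funsor setting, each resulting product would then be grouped into a single Contraction term rather than left as nested binaries.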
Tested: test_joint.py, test_gaussian.py, test_affine.py, and test_einsum.py.