
Prepare branch for the TFP 0.12.0 release #1199

Merged 36 commits on Dec 22, 2020
Conversation

@jburnim (Member) commented Dec 22, 2020

No description provided.

brianwa84 and others added 30 commits December 21, 2020 10:44
Hello,

This PR re-implements the stopping ratio logistic distribution that was already discussed and reviewed [here](tensorflow#963).

@srvasude, sorry for closing the other PR. I have incorporated your requested changes, except for the transpose that needs to be done when sampling.

Thanks.

Cheers,
Simon

COPYBARA_INTEGRATE_REVIEW=tensorflow#990 from dirmeier:stopping_ratio 0743b54
PiperOrigin-RevId: 346376187
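
For context, a brief usage sketch of the new distribution, assuming it is exposed as `tfd.StoppingRatioLogistic` with `cutpoints`/`loc` parameters analogous to `tfd.OrderedLogistic` (the parameter names here are an assumption, not taken from the PR):

```python
import tensorflow_probability as tfp
tfd = tfp.distributions

# Assumed parameterization: `cutpoints` are the ordinal thresholds,
# `loc` is the latent location.
dist = tfd.StoppingRatioLogistic(cutpoints=[-1., 0., 1.], loc=0.5)
samples = dist.sample(10)            # integer categories in {0, 1, 2, 3}
log_probs = dist.log_prob(samples)
```
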
PiperOrigin-RevId: 346378914
…tyle-docstrings

PiperOrigin-RevId: 346395749
…utions, some nested distributions, and a mean field option.

PiperOrigin-RevId: 346440980
This means one need only `import tensorflow_probability` to load saved models
that contain serialized TFP Keras layers and TFP CompositeTensor specs.

PiperOrigin-RevId: 346639386
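
A minimal sketch of the workflow this enables (the SavedModel path below is hypothetical): importing `tensorflow_probability` registers the deserialization hooks, so `tf.keras.models.load_model` can restore TFP layers without `custom_objects`.

```python
import tensorflow as tf
import tensorflow_probability  # noqa: F401 -- imported for its registration side effects

# "saved_tfp_model" is a hypothetical path to a SavedModel containing TFP Keras layers.
model = tf.keras.models.load_model("saved_tfp_model")
```
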
…he `Restructure` bijector.

PiperOrigin-RevId: 346669515
This is adapted from `SampleDiscardingKernel`, sans burn-in; it has the advantage of not wrapping `KernelResults`.

PiperOrigin-RevId: 346698326
PiperOrigin-RevId: 346793654
PiperOrigin-RevId: 346910502
PiperOrigin-RevId: 347085672
…ated as "_composite_tensor_shape_parameters".

Prior to the fix, test fails with "ValueError: Input tensor 'IndependentNormal_2/log_prob/Sum:0' enters the loop with shape (), but has shape <unknown> after one iteration. To allow the shape to vary across iterations, use the `shape_invariants` argument of tf.while_loop to specify a less-specific shape."

PiperOrigin-RevId: 347446310
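
For reference, a generic illustration of the `shape_invariants` mechanism the quoted error refers to (this is not the TFP code path that was fixed):

```python
import tensorflow as tf

# `m` doubles its leading dimension each iteration, so its static shape varies;
# declaring TensorShape([None, 2]) tells the loop that dimension may change.
i0 = tf.constant(0)
m0 = tf.ones([2, 2])
_, m = tf.while_loop(
    cond=lambda i, m: i < 3,
    body=lambda i, m: [i + 1, tf.concat([m, m], axis=0)],
    loop_vars=[i0, m0],
    shape_invariants=[i0.get_shape(), tf.TensorShape([None, 2])],
)
```
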
…straining bijector.

PiperOrigin-RevId: 347448699
…dSampleDistribution`.

PiperOrigin-RevId: 347528419
…on_test.testLogProbSample`.

Re-enable after http://b/175654800 is resolved.

PiperOrigin-RevId: 347535479
PiperOrigin-RevId: 347660796
… contexts.

Add a property-based test that tries to create Distributions in eager mode then
sample from them in graph mode, to exercise the failure mode.

PiperOrigin-RevId: 347692243
Adds `tfp.experimental.distributions.log_prob_ratio(p, x, q, y)`, which computes `p.log_prob(x) - q.log_prob(y)`.

Custom implementations are registered for `tfd.Independent`, `tfd.Sample`, `tfd.JointDistribution*`, `tfd.TransformedDistribution`, `tfb.Chain`, and `tfb.ScaleMatvecDiag`.

MVNDiag is tested as a proof of concept, in transformed_distribution_test.

PiperOrigin-RevId: 347822296
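
A minimal sketch of the API described above (assuming TFP >= 0.12); the registered custom implementations are used where they give better numerics than the naive subtraction:

```python
import tensorflow_probability as tfp
tfd = tfp.distributions

p = tfd.Normal(loc=0., scale=1.)
q = tfd.Normal(loc=0.1, scale=1.)
# Equivalent to p.log_prob(0.5) - q.log_prob(0.6), modulo numerics.
diff = tfp.experimental.distributions.log_prob_ratio(p, 0.5, q, 0.6)
```
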
brianwa84 and others added 6 commits December 21, 2020 12:16
Also fixes an XLA compilation issue (no StringFormat, PrintV2 ops in XLA) introduced by the logdet degree-of-freedom warning. This was exposed by the new test in deterministic_test.py (Deterministic default bijector uses Chain, which uses composition.py).

PiperOrigin-RevId: 347832464
Implement burn-in by sequencing two calls of run_kernel, and thinning
by inserting a ThinningKernel into the kernel onion.

Also change the default tracing function to account for the fact that
the chain state history is no longer returned separately by default.

Delete tests that no longer make sense.

PiperOrigin-RevId: 347836706
Previously, this raised the exception: "TypeError: Tensors in list passed to 'inputs' of 'AddN' Op have types [float64, float64, float32] that don't all match."

PiperOrigin-RevId: 347871205
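
A generic illustration of the constraint behind the quoted error (not the fixed TFP code path): `AddN` requires all of its inputs to share one dtype.

```python
import tensorflow as tf

try:
    tf.add_n([tf.constant(1., tf.float64),
              tf.constant(2., tf.float64),
              tf.constant(3., tf.float32)])
except (TypeError, tf.errors.InvalidArgumentError) as e:
    print(e)  # mixed float64/float32 inputs are rejected
```
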
Gamma(concentration=0) always samples 0.
ExpGamma(concentration=0) always samples -inf.
Beta(concentration1=0) always samples 0.
Beta(concentration0=0) always samples 1.
BetaBinomial(concentration1=0) always samples 0.
BetaBinomial(concentration0=0) always samples total_counts.
Likewise for Dirichlet and DirichletMultinomial.

Not changing the validation because (i) inertia, and (ii) any of these
distributions is still degenerate with a zero concentration, so such
parameterizations should arguably be avoided when possible.

PiperOrigin-RevId: 347876446
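
A small sketch of the degenerate edge cases listed above (assuming TFP >= 0.12; parameter names mirror the existing Gamma/Beta constructors):

```python
import tensorflow_probability as tfp
tfd = tfp.distributions

tfd.Gamma(concentration=0., rate=1.).sample()             # always 0.
tfd.ExpGamma(concentration=0., rate=1.).sample()          # always -inf
tfd.Beta(concentration1=0., concentration0=1.).sample()   # always 0.
tfd.Beta(concentration1=1., concentration0=0.).sample()   # always 1.
```
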
PiperOrigin-RevId: 348378916

Labels: cla: yes (Declares that the user has signed the CLA)