
sample_prior_predictive and sample_posterior_predictive issues with use_auto_batching=False #293

Closed
dirmeier opened this issue Jul 6, 2020 · 2 comments


dirmeier commented Jul 6, 2020

Hi,

when sampling from the prior predictive with use_auto_batching=False, setting sample_shape doesn't work and a single draw is always returned:

```python
dat = tfd.Normal(loc=1.0, scale=1.0).sample(10)

@pm.model
def model():
    m = yield pm.Normal("mean", loc=0.0, scale=1.0)
    lik = yield pm.Normal("lik", loc=m, scale=1.0, observed=dat)
    return lik

mod = model()
ppc = pm.sample_prior_predictive(mod, sample_shape=10, use_auto_batching=False)
ppc = ppc.prior_predictive
print(ppc["model"])
```

```
<xarray.DataArray 'model' (chain: 1, draw: 1)>
array([[-0.6240528]], dtype=float32)
Coordinates:
  * chain    (chain) int64 0
  * draw     (draw) int64 0
```

The same behaviour occurs when sampling from the posterior predictive. Am I missing something here? A Colab that demonstrates this is here.

Thanks!

Cheers,
Simon

@lucianopaz
Contributor

@dirmeier, the problem is that you didn't specify that m is conditionally_independent=True. This isn't properly documented because I never had the time to write an explanatory notebook on how to manually vectorize a model. The closest thing at the moment is this example from the forward sampling functions.

@dirmeier
Contributor Author

dirmeier commented Jul 6, 2020

Ah, right! Thanks.
