
Commit

[MAINT] Fix for #3346 (#3352)
* Fix for #3346. `draw_values` had a statement saying that if a variable's name was in the supplied point, the value for that variable would be taken from the point dictionary. This was too permissive, and different from what `_draw_value` does: there, the value is taken from the point dictionary only if the variable has a `model` attribute, i.e. it is an RV instance rather than a Deterministic.

* Changed expected SMC exact step values.

* Added comment explaining why we check for the model attribute of parameters to get their values from the point dictionary.
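The distinction the commit message relies on can be illustrated with a minimal, self-contained sketch (not pymc3 code itself): in pymc3, RV instances such as `FreeRV`/`ObservedRV` carry a `model` attribute, while `Deterministic` nodes are plain theano `TensorVariable`s without one. The class and function names below are hypothetical stand-ins for illustration only.

```python
class FakeRV:
    """Stand-in for a pymc3 RV: named, and carries a `model` attribute."""
    def __init__(self, name):
        self.name = name
        self.model = object()


class FakeDeterministic:
    """Stand-in for a Deterministic: named, but has no `model` attribute."""
    def __init__(self, name):
        self.name = name


def take_from_point(p, point):
    # Old (too permissive) check: `p.name in point` alone was enough,
    # so Deterministics were also looked up in the point dictionary.
    # New check: additionally require the `model` attribute, which
    # filters out Deterministics so they get recomputed instead.
    return p.name is not None and hasattr(p, 'model') and p.name in point


point = {'x': 1.0, 'p': 0.5}
assert take_from_point(FakeRV('x'), point)            # RV: value taken from point
assert not take_from_point(FakeDeterministic('p'), point)  # Deterministic: recomputed
```

Recomputing Deterministics rather than reading stale values from `point` is what lets posterior predictive sampling track shared variables whose shape changed after sampling.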
lucianopaz authored and ColCarroll committed Jan 25, 2019
1 parent 254c574 commit cdb69ec
Showing 4 changed files with 230 additions and 202 deletions.
1 change: 1 addition & 0 deletions RELEASE-NOTES.md
@@ -11,6 +11,7 @@
- Added the `broadcast_distribution_samples` function that helps broadcasting arrays of drawn samples, taking into account the requested `size` and the inferred distribution shape. This sometimes is needed by distributions that call several `rvs` separately within their `random` method, such as the `ZeroInflatedPoisson` (Fix issue #3310).
- The `Wald`, `Kumaraswamy`, `LogNormal`, `Pareto`, `Cauchy`, `HalfCauchy`, `Weibull` and `ExGaussian` distributions `random` method used a hidden `_random` function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this (Similar to issue #3310).
- Added a fix to allow the imputation of single missing values of observed data, which previously would fail (Fix issue #3122).
- Fix for #3346. The `draw_values` function was too permissive with what could be grabbed from inside `point`, which led to an error when sampling posterior predictives of variables that depended on shared variables that had changed their shape after `pm.sample()` had been called.
- Fix for #3354. `draw_values` now adds the theano graph descendants of `TensorConstant` or `SharedVariables` to the named relationship nodes stack, only if these descendants are `ObservedRV` or `MultiObservedRV` instances.

### Deprecations
5 changes: 3 additions & 2 deletions pymc3/distributions/distribution.py
@@ -309,7 +309,8 @@ def draw_values(params, point=None, size=None):
             # param was drawn in related contexts
             v = drawn[(p, size)]
             evaluated[i] = v
-        elif name is not None and name in point:
+        # We filter out Deterministics by checking for `model` attribute
+        elif name is not None and hasattr(p, 'model') and name in point:
             # param.name is in point
             v = point[name]
             evaluated[i] = drawn[(p, size)] = v
@@ -495,7 +496,7 @@ def _draw_value(param, point=None, givens=None, size=None):

         dist_tmp.shape = distshape
         try:
-            dist_tmp.random(point=point, size=size)
+            return dist_tmp.random(point=point, size=size)
         except (ValueError, TypeError):
             # reset shape to account for shape changes
             # with theano.shared inputs
26 changes: 26 additions & 0 deletions pymc3/tests/test_sampling.py
@@ -302,6 +302,32 @@ def test_model_not_drawable_prior(self):
             samples = pm.sample_posterior_predictive(trace, 50)
         assert samples['foo'].shape == (50, 200)

+    def test_model_shared_variable(self):
+        x = np.random.randn(100)
+        y = x > 0
+        x_shared = theano.shared(x)
+        y_shared = theano.shared(y)
+        with pm.Model() as model:
+            coeff = pm.Normal('x', mu=0, sd=1)
+            logistic = pm.Deterministic('p', pm.math.sigmoid(coeff * x_shared))
+
+            obs = pm.Bernoulli('obs', p=logistic, observed=y_shared)
+            trace = pm.sample(100)
+
+        x_shared.set_value([-1, 0, 1.])
+        y_shared.set_value([0, 0, 0])
+
+        samples = 100
+        with model:
+            post_pred = pm.sample_posterior_predictive(trace,
+                                                       samples=samples,
+                                                       vars=[logistic, obs])
+
+        expected_p = np.array([logistic.eval({coeff: val})
+                               for val in trace['x'][:samples]])
+        assert post_pred['obs'].shape == (samples, 3)
+        assert np.allclose(post_pred['p'], expected_p)

     def test_deterministic_of_observed(self):
         meas_in_1 = pm.theanof.floatX(2 + 4 * np.random.randn(100))
         meas_in_2 = pm.theanof.floatX(5 + 4 * np.random.randn(100))
