Save posteriors as a FITS/HDF file #25

Closed
avivajpeyi opened this issue Sep 17, 2020 · 4 comments · Fixed by #35

Comments

@avivajpeyi (Collaborator)

No description provided.

@avivajpeyi (Collaborator, Author)

Look into arviz

@dfm (Owner) commented Sep 18, 2020

I think this is what we want: https://github.com/arviz-devs/arviz/blob/6589319d7bc596416602010e5e0998a57237ef17/arviz/data/inference_data.py#L215

@avivajpeyi (Collaborator, Author)

Also found this: pymc-devs/pymc#2189

Question for @dfm: are we OK with

  • saving the posteriors from the pymc3.backends.base.MultiTrace as an HDF5 file
  • loading the posterior HDF5 file back as a pandas DataFrame
  • the loaded object not being a pymc3.backends.base.MultiTrace?

If we are OK with saving/loading the posteriors via a pd.DataFrame, then

posteriors = pm.trace_to_dataframe(trace)

makes it quite easy :)
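To make the proposal concrete, here is a minimal sketch of the DataFrame round trip, assuming pandas with the optional PyTables dependency installed; the toy DataFrame and the `posteriors.h5` filename are illustrative stand-ins for the real `pm.trace_to_dataframe(trace)` output:

```python
import numpy as np
import pandas as pd

# Toy stand-in for pm.trace_to_dataframe(trace): one column per parameter.
posteriors = pd.DataFrame({"x": np.random.default_rng(0).normal(size=500)})

# Save the posteriors to an HDF5 file (requires PyTables)...
posteriors.to_hdf("posteriors.h5", key="posterior", mode="w")

# ...and load them back as a plain DataFrame, not a MultiTrace.
loaded = pd.read_hdf("posteriors.h5", key="posterior")
```

Note the caveat from the bullet list above: `loaded` is just a DataFrame of samples, so any sampler metadata attached to the MultiTrace is lost.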

@dfm (Owner) commented Sep 28, 2020

I think the following would be a better procedure:

import pymc3 as pm
import arviz as az
with pm.Model() as model:
    x = pm.Normal("x")
    trace = pm.sample()
    data = az.from_pymc3(trace)
data.to_netcdf("chain.netcdf")

Then we could load it using:

import arviz as az
new_data = az.from_netcdf("chain.netcdf")

This has the benefit that it stores all the sampling metadata as well as the samples themselves. We could do that manually, but this is probably better!
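For a self-contained illustration of that round trip, here is a sketch assuming arviz is installed; `az.from_dict` with a toy 2-chain, 100-draw array stands in for the real `az.from_pymc3(trace)` call, and `chain.netcdf` is the filename from the snippet above:

```python
import numpy as np
import arviz as az

# Build a toy InferenceData (2 chains, 100 draws) in place of a real pymc3 trace.
rng = np.random.default_rng(0)
data = az.from_dict(posterior={"x": rng.normal(size=(2, 100))})

# Round-trip through NetCDF, as proposed above.
data.to_netcdf("chain.netcdf")
new_data = az.from_netcdf("chain.netcdf")

# The chain/draw structure survives the round trip.
print(new_data.posterior["x"].shape)  # (2, 100)
```

With a real pymc3 trace, the saved file would also carry groups such as `sample_stats`, which is the metadata benefit mentioned above.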
