
Pretty printing of Turing models #91

Open
DominiqueMakowski opened this issue May 19, 2023 · 9 comments

Comments

@DominiqueMakowski

I am trying to use TuringGLM to better learn Turing, so I wrote the simplest possible LM and would like to rewrite it in raw Turing.

using RDatasets
using TuringGLM

data = RDatasets.dataset("datasets", "mtcars")

fm = @formula(MPG ~ WT)
model = turing_model(fm, data)
model
DynamicPPL.Model{TuringGLM.var"#normal_model#19"{Int64,Int64,CustomPrior},(:y, :X, :predictors, :μ_X, :σ_X, :prior, :residual),(:predictors, :μ_X, :σ_X, :prior, :residual),(),Tuple{Vector{Float64},Matrix{Float64},Int64,Int64,Int64,CustomPrior,Float64},Tuple{Int64,Int64,Int64,CustomPrior,Float64},DynamicPPL.DefaultContext}(TuringGLM.var"#normal_model#19"{Int64,Int64,CustomPrior}(0, 1, CustomPrior(TDist{Float64}(ν=3.0), LocationScale{Float64,Continuous,TDist{Float64}}(
                μ:19.2σ:5.411498097545447ρ:TDist{Float64}(ν=3.0)
            ), nothing), Core.Box(TuringGLM.var"#normal_model#18#20"(Core.Box(TuringGLM.var"#normal_model#19"{Int64,Int64,CustomPrior}())))), (y=[21.0, 21.0, 22.8, 21.4, 18.7, 18.1, 14.3, 24.4, 22.8, 19.2 … 15.2, 13.3, 19.2, 27.3, 26.0, 30.4, 15.8, 19.7, 15.0, 21.4], X=[2.62; 2.875…3.57; 2.78;;], predictors=1, μ_X=0, σ_X=1, prior=CustomPrior(TDist{Float64}(ν=3.0), LocationScale{Float64,Continuous,TDist{Float64}}( #= circular reference @-4 =#
                μ:19.2σ:5.411498097545447ρ:TDist{Float64}(ν=3.0)
            ), nothing), residual=6.026948052089105), (predictors=1, μ_X=0, σ_X=1, prior=CustomPrior(TDist{Float64}(ν=3.0), LocationScale{Float64,Continuous,TDist{Float64}}(
                μ:19.2σ:5.411498097545447ρ:TDist{Float64}(ν=3.0)
            ), nothing), residual=6.026948052089105), DynamicPPL.DefaultContext())

Unfortunately, I can't get my head around this output to format in a standard Turing syntax. Is there a way to "pretty-print" or format the Turing model call? Thanks!

@storopoli
Member

This is how Turing.jl prints the model: as a Julia struct.
There is nothing special about how TuringGLM handles the model.
We just parse the formula into a Turing model and delegate everything else to Turing.

@torfjelde is there a way to get the underlying model of an instantiated Turing model?

@torfjelde
Member

is there a way to get the underlying model of an instantiated Turing model?

What do you mean by this? As in, what information are you hoping to extract? As you say, it's just a struct, so you can access the arguments it's been instantiated with, etc.
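For instance, a minimal sketch of inspecting the struct directly (assuming the `model` from the first comment; `args` is the NamedTuple of arguments a `DynamicPPL.Model` carries):

```julia
# A DynamicPPL.Model is a plain struct; its arguments are stored
# in the `args` NamedTuple and can be read back out directly.
model = turing_model(fm, data)   # from the example above

keys(model.args)      # names of all stored arguments, e.g. (:y, :X, :predictors, ...)
model.args.y          # the observed response vector
model.args.prior      # the CustomPrior that TuringGLM chose
```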

As for the pretty-printing, I agree we should do something. I'll make an issue over at DPPL.

@storopoli
Member

What do you mean by this?

The underlying macro call.
Like:

@model function demo(x, y)
  # Assumptions
  σ2 ~ InverseGamma(2, 3)
  σ = sqrt(σ2)
  μ ~ Normal(0, σ)

  # Observations
  x ~ Normal(μ, σ)
  y ~ Normal(μ, σ)
end

@torfjelde
Member

torfjelde commented Jun 1, 2023 via email

@storopoli
Member

They want to see the `@model` macro definition that results when they create a model using `turing_model` and `@formula`.

@torfjelde
Member

torfjelde commented Jun 1, 2023

Gotcha. That would have to be provided by TuringGLM, if so. A macro generates code, so once you apply a macro such as @model, the result no longer records what the original source looked like.
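To illustrate the point, Julia's built-in `@macroexpand` shows what a macro call turns into; running it on a `@model` definition yields the generated evaluator code rather than the tilde syntax the user wrote (a sketch using the `demo` model from above):

```julia
# A macro expands to generated code; the original syntax is not kept.
# @macroexpand reveals what @model produces from a definition:
using Turing

expanded = @macroexpand @model function demo(x)
    x ~ Normal(0, 1)
end
# `expanded` is the lowered DynamicPPL evaluator code, not the
# original `x ~ Normal(0, 1)` source, so the macro call cannot be
# recovered from an instantiated model.
```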

@storopoli
Member

Yeah, but I don't want TuringGLM to add any scaffolding on top of Turing. That would require a custom TuringGLM type holding the source as a field (maybe a String).

@DominiqueMakowski
Author

My rationale for this was two-fold:

  • Educational: I know that many people learned Stan, for instance, by fitting models in brms and then inspecting the generated Stan code. Many newcomers transitioning to Julia might similarly start with TuringGLM but be interested in seeing what the model looks like in raw Turing.
  • Clarity: When we print the model now, it is not clear what priors/model have been used, which impedes easy and transparent reporting. Some pretty printing along the lines of "these are the priors used:" would be useful in that regard, even if not presented exactly like a Turing model :)

@torfjelde
Member

I don't have the time to pursue this right now, but you could do something like this:

struct TuringGLMModel{M,F,P}
    model::M
    formula::F
    priors::P
    # other stuff
    # ...
end

and then just display this in a different way than what you would for typical Turing.jl models.
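For example, a hypothetical custom `Base.show` method on that wrapper type (the field names follow the `TuringGLMModel` sketch above and are illustrative, not part of TuringGLM's actual API):

```julia
# A sketch of overriding display for the hypothetical wrapper type,
# so printing reports the formula and priors instead of the raw
# DynamicPPL.Model struct.
function Base.show(io::IO, ::MIME"text/plain", m::TuringGLMModel)
    println(io, "TuringGLM model")
    println(io, "  formula: ", m.formula)
    println(io, "  priors:  ", m.priors)
end
```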
