Typed IR passes #67

Closed
MikeInnes opened this issue Jul 30, 2018 · 4 comments

@MikeInnes (Contributor) commented Jul 30, 2018

Related to #65, there are various reasons to want to implement Cassette-like passes over typed IR, and more generally to have control over Julia's optimisation stack in a given context. Semantic code transformations should always be possible on untyped IR, but we might want to, for example, implement custom optimisation passes – like exploiting model parallelism in ML.

This is a fairly vague discussion issue; it will take some back-and-forth with the core Julia compiler folks to work out a good approach.
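To make the "Cassette-like passes over untyped IR" idea concrete, here is a minimal sketch of the kind of semantic transformation Cassette already supports today. The context name `TraceCtx` and the function `g` are illustrative, not from the issue; the `@context`/`overdub`/`prehook` pattern is the documented Cassette API.

```julia
# A minimal Cassette-style pass over untyped IR: every call made while
# running under the context is intercepted before it executes.
using Cassette

Cassette.@context TraceCtx

# A trivial "pass": print each call before it runs.
Cassette.prehook(::TraceCtx, f, args...) = println("calling ", f, args)

g(x) = x^2 + 1
Cassette.overdub(TraceCtx(), g, 3)  # runs g(3) with the hook applied
```

The point of this issue is that no analogous mechanism exists for *typed* (post-inference) IR, where optimisation passes would need to run.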

@jrevels jrevels mentioned this issue Jul 30, 2018
@jrevels (Collaborator) commented Jul 30, 2018

Centralizing discussion of this topic here, pasting from the OP of #65:

I think that IR should eventually be the default with lowered code hidden from view, but this isn't possible right now as phi nodes are only accepted by the compiler post-type-inference. One workaround is to work on code_typed, preserving type information as much as possible, and then allow generated functions to return typed code without trying to re-infer it. It's hacky but it does get the job done.
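The workaround described above starts from `code_typed` output rather than lowered code, so that inferred type information is preserved. A small sketch of what that starting point looks like (the function `f` is illustrative; `code_typed` and the `CodeInfo` fields are standard Julia):

```julia
# Inspect post-inference IR: code_typed returns a (CodeInfo => return type)
# pair for each method matching the given signature.
f(x) = x + 1

ci, rettype = only(code_typed(f, Tuple{Int}))

@show rettype              # the inferred return type
foreach(println, ci.code)  # the typed SSA-form IR statements a pass would rewrite
```

A pass in this scheme would rewrite `ci.code` and hand the result back via a generated function, without re-running inference, which is the hacky part being discussed.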

Pasting from my reply:

Using Cassette as a front end to post-type-inference pass injection is potentially feasible, though I wonder whether it would be better to just build a new injection interface directly into the optimizer. My initial thought is that a sane approach to this would likely require the optimizer to provide a bona fide pass manager anyway.

@jrevels (Collaborator) commented Jul 30, 2018

We should plan a meeting to brainstorm on this at JuliaCon; there are a lot of interesting directions it could go.

@vchuravy (Member) commented:

Closing this as a non-goal for Cassette

@MikeInnes (Contributor, Author) commented:

For anyone watching, the current thinking is that the Julia compiler will get more tools for injecting custom inference / optimisation passes, with JuliaLang/julia#33955 being the starting point for that. Mjolnir also provides some prototyping in this direction, which will allow for things like custom optimisation passes in the short term.
