
[MergeComposite] Fix InferType when module contains Prelude #5797

Merged Jun 16, 2020 (1 commit)

Conversation

lixiaoquan
Contributor

A function may refer to other resources in the same module, so keep
the contents of the original module when inferring a function.

cc @mbaret @mbrookhart @comaniac
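To illustrate the bug this PR fixes, here is a toy Python sketch (not the TVM API; the "module" is just a dict and "inference" only checks that referenced globals exist): building a fresh module from a single function drops Prelude-style definitions the function depends on, while copying the original module keeps them.

```python
# Toy model: a module maps global names to (params, body) pairs, and a body
# refers to other globals with '@name' tokens.

def referenced_globals(body):
    """Collect the global names a function body refers to."""
    return {tok for tok in body.split() if tok.startswith("@")}

def infer_type(module, name):
    """Fail if the named function references a global missing from the module."""
    _, body = module[name]
    missing = referenced_globals(body) - {"@" + k.lstrip("@") for k in module} - set(module)
    if missing:
        raise KeyError("undefined globals: %s" % sorted(missing))
    return "ok"

# A module whose 'main' uses a Prelude-style helper '@map'.
prelude_module = {
    "@map": ((), "builtin"),
    "main": ((), "call @map"),
}

# Old behaviour: a fresh module built from 'main' alone loses '@map'.
fresh = {"main": prelude_module["main"]}
try:
    infer_type(fresh, "main")
except KeyError as e:
    print("fresh module fails:", e)

# Fixed behaviour: copy the original module, then update 'main' in place.
fixed = dict(prelude_module)
fixed["main"] = prelude_module["main"]
print(infer_type(fixed, "main"))
```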

Contributor

@mbrookhart mbrookhart left a comment


LGTM

For my edification, is there a tutorial on what Prelude actually does somewhere? It's not something I've been able to fit into my mental model of TVM yet.

Contributor

@comaniac comaniac left a comment


Thanks for pointing this out. IMHO, the bug also indicates that this is probably the only use case that creates a new module from a transformed function in order to run type inference. We should think about a better way to do so.

  // Before:
  auto mod = IRModule::FromExpr(expr);

  // After:
  Function InferType(const Function& expr, const IRModule& m) {
    IRModule mod(m);
    mod->Update(mod->GetGlobalVar("main"), expr);
Contributor


Since we now update the original module with the new transformed function, should we update the corresponding function instead of main?

Contributor


Mmm, you're right; I've been reading the pass infrastructure a little more today.

The FunctionPass, however, doesn't seem to pass down to the passes the information about which Function we're in: https://github.com/apache/incubator-tvm/blob/8578096853eec5711bfcc9a3a68145fd12a135cb/src/relay/ir/transform.cc#L123-L132

I guess we can either change that API (which touches a lot of passes), or maybe invert this Map https://github.com/apache/incubator-tvm/blob/4347b41a5e64a2a453297b371232d6e101051b3c/include/tvm/ir/module.h#L53, find the global var, and store that in the class.
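The second option mentioned above can be sketched with plain Python dicts standing in for TVM objects (hypothetical names; the real map is the module's GlobalVar-to-Function map): invert the map so that a pass holding only a Function can recover which global it came from.

```python
# Toy stand-in for a module's {GlobalVar name -> Function} map.
functions = {"main": "func_a", "helper": "func_b"}

# Invert the map; functions are distinct within a module, so this is safe.
var_of = {func: gvar for gvar, func in functions.items()}

# A pass that only has the function object can now find its global var.
print(var_of["func_b"])  # -> "helper"
```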

Contributor


Or maybe we can rewrite MergeComposite as a module pass so that we can iterate over the functions ourselves.

Contributor Author


Since now we update the original module with new transformed function, should we update the corresponding function instead of main?

For now, other functions in the module don't call 'main', so it is safe to replace 'main'. If we are inferring the function mutated from 'main', that is exactly what we want; if we are inferring another function, 'main' becomes a duplicated function, but that does no harm. Also, given only a mutated function, it seems we can't find it in the original global_var_map_ because we don't know the name of its corresponding global variable.

Contributor


Makes sense, although I still prefer to avoid dummy modules. I think it's fine to let this PR in first, and maybe I can fix that later on.

@lixiaoquan
Contributor Author

LGTM

For my edification, is there a tutorial on what Prelude actually does somewhere? It's not something I've been able to fit into my mental model of TVM yet.

The TensorFlow frontend generates a module with Prelude:

https://github.com/apache/incubator-tvm/blob/7a419718c121164fc260864014e1d0d81f556949/python/tvm/relay/frontend/tensorflow.py#L2729

@masahi
Member

masahi commented Jun 16, 2020

The PyTorch frontend also uses Prelude. It contains a List ADT (as in functional programming) and functions on lists (cons, map, filter, concat, length, nth, etc.).

It allows creating and manipulating dynamic lists. For the PyTorch frontend, this is necessary for supporting Python list append and stacking a list of tensors, for example.

Prelude is mostly written in the Relay "language" (files with the .rly extension). Relay has a parser for that language which translates the text representation to the C++ representation, also usable from Python.
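For readers unfamiliar with the List ADT described above, here is a minimal pure-Python analogue (hypothetical names; the real Prelude is defined in Relay's .rly text format, not Python):

```python
# A cons-list in the functional-programming style: a list is either Nil
# (empty) or a (head, tail) pair built by cons.

Nil = None

def cons(head, tail):
    """Prepend an element to a list."""
    return (head, tail)

def length(lst):
    """Number of elements in the list."""
    return 0 if lst is Nil else 1 + length(lst[1])

def map_list(f, lst):
    """Apply f to every element, producing a new list."""
    return Nil if lst is Nil else cons(f(lst[0]), map_list(f, lst[1]))

def nth(lst, i):
    """Zero-based indexing by walking the tail."""
    return lst[0] if i == 0 else nth(lst[1], i - 1)

xs = cons(1, cons(2, cons(3, Nil)))
print(length(xs))                               # 3
print(nth(map_list(lambda x: x * 10, xs), 1))   # 20
```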

@masahi masahi self-assigned this Jun 16, 2020
@masahi masahi merged commit 6ed8d7a into apache:master Jun 16, 2020
@masahi
Member

masahi commented Jun 16, 2020

Thanks @lixiaoquan @comaniac @mbrookhart

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Jun 30, 2020
zhiics pushed a commit to neo-ai/tvm that referenced this pull request Jul 2, 2020