
[Relay][Dyn] Dynamic TopK Op #6008

Merged · 5 commits · Jul 10, 2020
Conversation

mbrookhart (Contributor):

@zhiics @lixiaoquan @kevinthesun

This splits dynamic TopK out of the standard op, many thanks to @kevinthesun for all of the pieces. :)

Adds the op to the dynamic to static pass, and adds dynamic testing.

Thanks!

@mbrookhart mbrookhart changed the title Dynamic TopK Op [Relay][Dyn] Dynamic TopK Op Jul 7, 2020
Matthew Brookhart added 2 commits July 8, 2020 11:21
Review thread on the k-dispatch code:

    out = _make.topk(data, k, axis, ret_type, is_ascend, dtype)
    if isinstance(k, Constant):
        k = np.asscalar(k.data.asnumpy())
    if isinstance(k, Expr):
lixiaoquan (Contributor):
I'm concerned about this case: if data's shape is not static, it needs to use _dyn_make.topk() too.

mbrookhart (Contributor, Author):

I might be missing something. If it's a Relay Constant, we go static; if it's another Relay Expr, we go dynamic; if it's a python literal, we go static. Is there another possibility?
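(The three-way dispatch described here can be sketched as follows. `Constant` and `Expr` below are minimal stand-ins for illustration, not Relay's actual classes, and `dispatch_topk` is a hypothetical helper that only reports which op would be chosen.)

```python
class Expr:
    """Stand-in for a Relay expression (hypothetical)."""

class Constant(Expr):
    """Stand-in for a Relay Constant holding a scalar k (hypothetical)."""
    def __init__(self, value):
        self.value = value

def dispatch_topk(k):
    # Relay Constant: fold k to a python scalar and take the static path
    if isinstance(k, Constant):
        k = k.value
    # any other Relay Expr: only the dynamic op can accept it
    if isinstance(k, Expr):
        return "dyn.topk"
    # a plain python literal also takes the static path
    return "topk"
```
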

lixiaoquan (Contributor):

What I meant is data, not k. data can only be a Relay Expr. Should we always use _dyn.topk and let DynamicToStatic determine whether it can be converted to a static op?

Because TopK's out_shape depends on the shape of data, if the shape of data is dynamic, out_shape becomes dynamic, and we need to use the shape function in that case:
https://github.com/apache/incubator-tvm/blob/eafb2aa13d6cd223629f17d5f6aab5a8d4fce7f5/src/relay/op/algorithm/topk.cc#L50

shape = infer_shape(data)
if Any_in_shape():
    # go dynamic
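A runnable version of this sketch, with a stand-in `Any` class for Relay's symbolic dimension (all names here are hypothetical illustrations, not TVM's actual API):

```python
class Any:
    """Stand-in for Relay's symbolic 'Any' dimension (hypothetical)."""

def any_in_shape(shape):
    # True if any dimension is unknown at compile time
    return any(isinstance(dim, Any) for dim in shape)

def pick_topk_variant(data_shape):
    # go dynamic when the input shape itself is dynamic
    return "dyn" if any_in_shape(data_shape) else "static"
```
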

kevinthesun (Contributor) commented Jul 9, 2020:

The current dyn namespace is for attrs that are Relay Exprs, which makes the shape func data dependent. For data-independent shape funcs we can still rely on the "static" version of the op.

lixiaoquan (Contributor), quoting the above:

> The current dyn namespace is for attrs that are Relay Exprs, which makes the shape func data dependent. For data-independent shape funcs we can still rely on the "static" version of the op.

I understand that. But it's possible that an op is input-shape dependent while the input shape itself is dynamic. In that case, we still need to use the shape function.

This is a similar case which doesn't involve topk but has the same issue:

v = var("v", int32)
t0 = arange(0, v, 1)                    # outputs a dynamic-shape tensor
t1 = strided_slice(t0, [0], [-1], [1])  # output shape depends on a dynamic input shape, even though the attrs are static
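The effect can be reproduced with a NumPy analogue (NumPy stands in for the runtime here; in Relay, t0's shape would be Any at compile time):

```python
import numpy as np

v = 7                        # pretend v is only known at runtime
t0 = np.arange(0, v, 1)      # shape (v,): dynamic from Relay's point of view
t1 = t0[0:-1:1]              # strided_slice with static attrs [0], [-1], [1]

# even though the slice attrs are static, t1's shape depends on v
assert t1.shape == (v - 1,)
```
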

mbrookhart (Contributor, Author):

Yeah, I think this is a bit of complexity around how @icemelon9 and co. implemented dynamic shapes in Relay. Basically, any op can take in Any shapes, at which point the graph becomes unrunnable on the graph runtime and has to be executed through the VM.

As @kevinthesun said, what we've been working on recently is allowing for non-constant attributes; the dyn namespace is mostly to make non-constant attributes explicit. If we need to separate out all dynamic shapes, I think we'd need to implement a VM-specific op namespace, and I'm not sure we want to go that far?

kevinthesun (Contributor) commented Jul 9, 2020:

@lixiaoquan Data-independent shape funcs are still there and can handle dynamic input shape cases. @mbrookhart I think for now we can limit the dyn namespace to Expr attrs and later consider how to have a more uniform interface.
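The data-dependent vs. data-independent distinction can be illustrated with toy shape funcs (hypothetical sketches, not TVM's shape-function API, and ignoring that topk also returns indices): a data-independent shape func needs only input shapes and compile-time attrs, while dyn.topk's shape func also needs the runtime value of k.

```python
def topk_shape_static(data_shape, k):
    # data-independent: k is a compile-time attribute, so the output
    # shape is computable from the input shape alone
    return tuple(data_shape[:-1]) + (k,)

def topk_shape_dyn(data_shape, k_tensor_value):
    # data-dependent: k arrives as a runtime tensor value, so the
    # output shape is only known once that value is available
    return tuple(data_shape[:-1]) + (int(k_tensor_value),)
```
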

lixiaoquan (Contributor):

Thanks for pointing that out; I just realized the "static" version of the op still has some dynamic-shape handling ability. I think I misunderstood the "dyn" namespace before.

@zhiics zhiics merged commit 474d472 into apache:master Jul 10, 2020
zhiics (Member) commented Jul 10, 2020:

Thanks @mbrookhart @lixiaoquan @kevinthesun

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Jul 14, 2020
* add dynamic topk op

* add topk to dynamic_to_static pass

* fix TF test

* fix pylint
trevor-m pushed a commit to neo-ai/tvm that referenced this pull request Jul 14, 2020
(same commits as above)