
Change TopK operator to allow dynamic 'k' #1829

Merged: 24 commits, Mar 7, 2019
Conversation

hariharans29 (Contributor, Author)

This PR changes the TopK operator to take 'k' as an input rather than as an attribute, so that 'k' can be supplied dynamically at runtime.
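For context, a minimal sketch of what this schema change looks like at the graph level, using onnx.helper.make_node (the tensor names X, K, Values, Indices and the value k=3 are illustrative, not taken from this PR):

```python
from onnx import helper

# Before this change (opset <= 9): 'k' is a static node attribute,
# fixed when the graph is built.
topk_old = helper.make_node(
    "TopK",
    inputs=["X"],
    outputs=["Values", "Indices"],
    k=3,
)

# After this change (opset >= 10): 'k' becomes a second input -- a 1-D
# int64 tensor -- so another node can compute it at runtime.
topk_new = helper.make_node(
    "TopK",
    inputs=["X", "K"],
    outputs=["Values", "Indices"],
)
```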

CLAassistant commented Feb 22, 2019

CLA assistant check: all committers have signed the CLA.

hariharans29 (Contributor, Author)

CC: @linkerzhang

linkerzhang (Member) left a review

LGTM. Thank you!

linkerzhang (Member)

@houseroad any comments please? Thanks a lot!

hariharans29 (Contributor, Author)

@houseroad @linkerzhang - Is this change good to go? Thanks!

spandantiwari (Contributor)

@hariharans29 - There was a failing backend test, which is why the CI failed. That test (test_densenet) is disabled for now. Could you try re-running the CI job to see if it passes?

houseroad (Member) left a review

Sorry to block:

Yes, we do use topk in pytorch-onnx-caffe2:
https://github.com/pytorch/pytorch/blob/master/torch/onnx/symbolic.py#L1305
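For reference, the exporter-side symbolic this change affects has roughly the following shape (an illustrative sketch, not the exact pytorch source; in torch.onnx's g.op convention the `_i` suffix marks an integer attribute):

```python
# Pre-opset-10 style: the exporter bakes 'k' into the TopK node as an
# attribute, which is exactly what stops working once the ONNX schema
# moves 'k' to a runtime input.
def topk(g, self, k, dim, largest=True, sorted=True):
    return g.op("TopK", self, k_i=k, axis_i=dim, outputs=2)
```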

spandantiwari (Contributor)

@houseroad - Just FYI: the failing test was test_densenet, which we coincidentally disabled today (pytorch/pytorch#17696) for the Upsample PR (#1773).

hariharans29 (Contributor, Author)

> Sorry to block:
>
> Yes, we do use topk in pytorch-onnx-caffe2:
> https://github.com/pytorch/pytorch/blob/master/torch/onnx/symbolic.py#L1305

@houseroad - Sorry, I don't quite understand. Could you please elaborate on the changes you are requesting?

houseroad (Member)

@hariharans29 I don't mean you need to change anything, but we need to prepare for the change; otherwise it will break our internal pipeline. I am thinking of pinning the PyTorch ONNX exporter to some stable opset by default, so that such changes won't cause problems anymore.
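Pinning the exporter to a stable opset would look roughly like this (a minimal sketch; the toy model and the choice of opset 9 are assumptions for illustration, while opset_version itself is a standard torch.onnx.export argument):

```python
import torch

# Hypothetical toy model; any traceable nn.Module exports the same way.
model = torch.nn.Linear(8, 4)
dummy_input = torch.randn(1, 8)

# Fixing opset_version keeps the exported graph on a known opset, so
# schema changes in newer opsets (like TopK's 'k' moving from attribute
# to input) don't silently break downstream consumers.
torch.onnx.export(model, dummy_input, "model.onnx", opset_version=9)
```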

hariharans29 (Contributor, Author)

@houseroad - Got it. I will wait for your go-ahead. Thanks.

houseroad (Member) left a review

Good to ship now.

houseroad merged commit 1ec81bc into onnx:master on Mar 7, 2019
hariharans29 (Contributor, Author)

@houseroad - Thanks a lot!

hariharans29 added a commit to hariharans29/onnx that referenced this pull request Aug 15, 2019
* More changes

* More changes

* More changes

* More changes

* Check-in modified model for topK

* More changes

* More changes

* More changes

* Changes to md files

* More changes

* More changes

* Fix broken build

* Fix build break

* Fix build break

* Fix build break

* More changes

* Fix build break

* revert

* revert file to old state
jcwchen pushed a commit to jcwchen/onnx that referenced this pull request Sep 23, 2020 (same commit list as above)