
[Relay, TF Frontend] Dilation2D operator support #5033

Merged

merged 20 commits into apache:master on Mar 17, 2020

Conversation

maheshambule
Contributor

@maheshambule maheshambule commented Mar 10, 2020

Added support for the Dilation2D operator, which is available in TensorFlow:
https://www.tensorflow.org/api_docs/python/tf/nn/dilation2d

Currently, only a generic implementation is added. The device-specific schedules that exist for other convolution operators are not included.

cc: @yongwww, @zhiics, @kevinthesun, @masahi, @comaniac
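
For reference, a minimal TF2 sketch of what the operator computes (shapes and values are illustrative only, not taken from this PR):

```python
import tensorflow as tf

# Grayscale morphological dilation: each output element is the max of
# (input + filter) over the (possibly dilated) filter window.
image = tf.random.uniform([1, 8, 8, 3])  # NHWC input
filt = tf.zeros([3, 3, 3])               # [height, width, depth] structuring element

out = tf.nn.dilation2d(
    image, filt,
    strides=[1, 1, 1, 1],
    padding="SAME",
    data_format="NHWC",
    dilations=[1, 1, 1, 1],
)
print(out.shape)  # (1, 8, 8, 3)
```

With a zero filter and unit dilations this reduces to a 3x3 max pool, which makes a handy sanity check for frontend test cases.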

Comment on lines 2419 to 2420
out_layout : Optional[str]
Layout of the output, by default, out_layout is the same as data_layout
Member
out_layout is not part of the parameter list.

Contributor Author

Removed

Comment on lines 39 to 50
dilation: int or a list/tuple of two ints
dilation size, or [dilation_height, dilation_width]
Returns
Member

Add out_dtype description.

Contributor Author

added

Comment on lines 101 to 122
dilation: int or a list/tuple of two ints
dilation size, or [dilation_height, dilation_width]
Returns
-------
Member

Add out_dtype here.

Contributor Author

added

@@ -442,6 +442,57 @@ def conv1d_transpose_strategy(attrs, inputs, out_type, target):
name="conv1d_transpose_ncw.generic")
return strategy


# dilation2d
def wrap_compute_dilation2d(topi_compute, need_data_layout=False, need_out_layout=False):
Contributor

need_out_layout is not used?

Contributor Author

removed
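
For context, a sketch of what the wrapper might look like with the unused flag dropped, following the pattern of the neighboring wrap_compute_* helpers in this file (the attribute names here are assumptions based on the surrounding strategy code, not a quote of the merged diff):

```python
def wrap_compute_dilation2d(topi_compute, need_data_layout=False):
    """Wrap a dilation2d topi compute for use in an op strategy."""
    def _compute_dilation2d(attrs, inputs, out_type):
        # get_const_tuple is assumed to be imported at the top of the
        # strategy module, as it is for the other wrappers.
        padding = get_const_tuple(attrs.padding)
        strides = get_const_tuple(attrs.strides)
        dilations = get_const_tuple(attrs.dilations)
        out_dtype = attrs.out_dtype
        # Fall back to the input dtype when no explicit out_dtype is set.
        out_dtype = inputs[0].dtype if out_dtype in ("same", "") else out_dtype
        args = [inputs[0], inputs[1], strides, padding, dilations]
        if need_data_layout:
            args.append(attrs.get_str("data_layout"))
        args.append(out_dtype)
        return [topi_compute(*args)]
    return _compute_dilation2d
```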

from .util import get_pad_tuple


def dilation2d_nchw(input, filter, stride, padding, dilation, out_dtype=None):
Contributor

In the Relay op we use rate as the name for the dilation rate, but here we use dilation. Do we want to keep them consistent?

Contributor Author

Now using 'dilations' instead of 'rates' everywhere. This is consistent with TF2's Dilation2d op as well as with the other TVM convolution operators.
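
For readers unfamiliar with the op: unlike a convolution, Dilation2D is a grayscale morphological dilation, i.e. a max of (input + filter) over the window rather than a sum of products. A naive NumPy reference for the NCHW layout (no padding, floating-point data assumed; purely illustrative, not the topi compute):

```python
import numpy as np

def dilation2d_nchw_ref(data, kernel, strides=(1, 1), dilations=(1, 1)):
    """out[b, c, y, x] = max over (i, j) of
    data[b, c, y*sh + i*dh, x*sw + j*dw] + kernel[c, i, j]"""
    b, c, h, w = data.shape
    _, kh, kw = kernel.shape          # kernel layout: (channel, kh, kw)
    sh, sw = strides
    dh, dw = dilations
    oh = (h - (kh - 1) * dh - 1) // sh + 1
    ow = (w - (kw - 1) * dw - 1) // sw + 1
    out = np.full((b, c, oh, ow), -np.inf, dtype=data.dtype)
    for i in range(kh):
        for j in range(kw):
            # Strided view of the input positions this kernel tap touches.
            window = data[:, :,
                          i * dh : i * dh + (oh - 1) * sh + 1 : sh,
                          j * dw : j * dw + (ow - 1) * sw + 1 : sw]
            out = np.maximum(out, window + kernel[:, i, j][None, :, None, None])
    return out
```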

@kevinthesun
Contributor

kevinthesun commented Mar 13, 2020

It looks like the commit history of this PR has diverged?

@maheshambule
Contributor Author

@kevinthesun, Yes working on it.

@maheshambule
Contributor Author

@kevinthesun, @siju-samuel Thanks for the review. Review comments are incorporated. Let me know if there are further comments.

Contributor

@kevinthesun left a comment

LGTM

@kevinthesun kevinthesun merged commit 646cfc6 into apache:master Mar 17, 2020
@kevinthesun
Contributor

Thanks @maheshambule @siju-samuel

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Apr 16, 2020
* update docs for dilation 2d

* dilation2d compute

* dilation2d register

* dilation2d rel compute

* dilation2d strategy

* dilation2d attrs

* dilation2d generic schedule

* dilation2d tf frontend support

* dilation2d tf frontend test case

* dilation2d test cases

* pylint fixes

* add exception for cuda target

* Update docstring

* Update docstring

* change rates to dilations

* removed unused param

* merge master

* Update nn.py

* Update nn.py
zhiics pushed a commit to neo-ai/tvm that referenced this pull request Apr 17, 2020 (same commit list as above)