
Add Convolution Backward Op #201

Open
qingyunqu opened this issue Sep 28, 2022 · 1 comment
Request description

Hi guys, I'm from ByteDance AML team.

HLO doesn't have ops like mhlo.convolution_backward_data or mhlo.convolution_backward_filter, since HLO lowers the backward ops to compositions like reverse + transpose + convolution_forward or convolution_forward + transpose.
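For illustration, here's a minimal sketch of that decomposition, assuming JAX and a stride-1, SAME-padded case (all shapes and names here are hypothetical): the data gradient of a forward convolution is itself a forward convolution over the spatially reversed, I/O-transposed filter.

```python
import jax
import jax.numpy as jnp
from jax import lax

x = jnp.ones((1, 3, 8, 8))   # NCHW input
w = jnp.ones((4, 3, 3, 3))   # OIHW filter

def conv(x, w):
    return lax.conv(x, w, window_strides=(1, 1), padding="SAME")

# What autodiff produces for the backward-data pass...
grad_x = jax.grad(lambda x: conv(x, w).sum())(x)

# ...matches a forward conv of the output cotangent with the spatially
# reversed, I/O-swapped filter (i.e. reverse + transpose + conv_forward).
g = jnp.ones((1, 4, 8, 8))                              # cotangent of the output
w_rev = jnp.flip(w, axis=(2, 3)).transpose(1, 0, 2, 3)  # reverse + transpose
grad_x_explicit = lax.conv(g, w_rev, window_strides=(1, 1), padding="SAME")
```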

But sometimes we want to transform the convolution data format (e.g. NCHW => NHWC; BTW, the NCHW format comes from torch_xla/torch-mlir). It's hard for me to do this data format transformation when there isn't a single dedicated operation.
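To make the pain point concrete, here's a hypothetical sketch (again assuming JAX) of the layout rewrite: when the backward pass is a single op, going NCHW => NHWC means relayouting the operands once and updating the dimension numbers, whereas a decomposed backward pass requires rewriting each fragment of the chain consistently.

```python
import jax.numpy as jnp
from jax import lax

x_nchw = jnp.ones((1, 3, 8, 8))
w_oihw = jnp.ones((4, 3, 3, 3))

# Relayout the operands once...
x_nhwc = jnp.transpose(x_nchw, (0, 2, 3, 1))  # NCHW -> NHWC
w_hwio = jnp.transpose(w_oihw, (2, 3, 1, 0))  # OIHW -> HWIO

# ...and run the same convolution under the new layout by updating only
# its dimension numbers. A single backward op could be relaid out the same
# way; a reverse/transpose/conv chain must be rewritten piece by piece.
y_nhwc = lax.conv_general_dilated(
    x_nhwc, w_hwio, window_strides=(1, 1), padding="SAME",
    dimension_numbers=("NHWC", "HWIO", "NHWC"))
```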

For the above reasons, I wonder if it would be possible to add convolution backward ops?

@burmako burmako self-assigned this Sep 29, 2022
@burmako burmako added the RFC label Sep 29, 2022
burmako commented Nov 15, 2022

Hi @qingyunqu! Apologies that it's taken quite a bit of time to get back to you; in my reply I wanted to reference the RFC process, and iterating on the details took some time.

Overall, this is a very reasonable feature request. Many related opsets have an op like that; here are some examples of these ops, in alphabetical order of the opset name:

For historical context, the philosophy behind HLO has been focused on providing a minimal opset paired with a client API of higher-level abstractions that decompose to the minimal opset. MHLO has inherited this philosophy with the MHLO/CHLO split, although about a dozen MHLO ops are currently misplaced and would be a better fit for CHLO (see #3 for some additional details). This is the context behind the current state of StableHLO, and it's one of the key pieces of related work as we work out the StableHLO philosophy.

To evaluate the possibility of having transpose_conv in the StableHLO opset, I think we need the following:

  • Survey of related work (mostly done above; we could also throw in an analysis of how JAX, PyTorch and TF model this, as sketched briefly after this list).
  • Design for what the potential transpose_conv op would look like (would it be a fork of convolution in all its glory, or a simplified version thereof?).
  • Analysis of pros and cons of having vs not having this op in StableHLO (I imagine this would boil down to how hard it would be to do pattern matching on the consumer side).
  • Recommendation (include in StableHLO, include in CHLO, don't include).
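As a small flavor of the survey point above, here's a sketch assuming JAX (the shapes are hypothetical): JAX models this as a dedicated transposed-convolution entry point rather than only a decomposition, and PyTorch and TF expose analogous ops via torch.nn.functional.conv_transpose2d and tf.nn.conv2d_transpose.

```python
import jax.numpy as jnp
from jax import lax

g = jnp.ones((1, 8, 8, 4))   # NHWC cotangent w.r.t. a conv output
w = jnp.ones((3, 3, 3, 4))   # HWIO filter of the forward conv

# transpose_kernel=True flips the spatial axes and swaps I/O, making this
# the data gradient of the corresponding forward convolution.
dx = lax.conv_transpose(g, w, strides=(1, 1), padding="SAME",
                        dimension_numbers=("NHWC", "HWIO", "NHWC"),
                        transpose_kernel=True)
print(dx.shape)  # (1, 8, 8, 3)
```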

At the moment, the main focus of the StableHLO team at Google is on finishing the StableHLO v1.0 specification. We have made significant progress with speccing the existing statically-shaped HLO semantics, with the goal of providing a well-understood baseline for future evolution. There's still much spec work to do, so I don't expect that we'll have the throughput to look into transpose_conv in Q4.

That said, anyone in the community can contribute to StableHLO! At today's OpenXLA meeting, @theadactyl shared that we're working on formalizing the OpenXLA governance process and invited everyone to contribute RFCs even if these RFCs would be more or less informal at the moment.

If transpose_conv is something that you feel strongly about and have some cycles for, do consider writing an RFC about this. We currently don't have an RFC template, but something along the lines of what's mentioned above would work. Same for the discussion / decision-making process - we don't have a process for that yet beyond the very high-level governance.md, but we can play it by ear.

@burmako burmako removed their assignment Nov 24, 2022
@burmako burmako changed the title [Feature Request] Add Convolution Backward Op Add Convolution Backward Op Nov 24, 2022