
tensor.softmax() not supported #914

Closed
williamlzw opened this issue Feb 15, 2023 · 1 comment · Fixed by #915

Comments

williamlzw commented Feb 15, 2023

https://github.com/ultralytics/ultralytics/blob/main/ultralytics/nn/modules.py
line 80:
self.conv(x.view(b, 4, self.c1, a).transpose(2, 1).softmax(1)).view(b, 4, a)

softmax(1) is not supported in TorchSharp.

NiklasGustafsson (Contributor) commented Feb 15, 2023

torch.softmax() is undocumented, so it's not been included in TorchSharp:

https://pytorch.org/docs/stable/search.html?q=torch.softmax&check_keywords=yes&area=default

However, it was recently reported as a bug, and a fix will appear in the next release. In the meantime, use torch.special.softmax() or torch.nn.functional.softmax().
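
For readers hitting the same error, here is a minimal C#/TorchSharp sketch of that workaround. The input shape, variable names, and the exact overloads used are assumptions for illustration, not taken from the thread:

using TorchSharp;
using static TorchSharp.torch;

// Any floating-point tensor works here; the shape is purely illustrative.
var x = rand(2, 4, 8);

// Until Tensor.softmax(dim) ships in the next release, route the call through
// the functional or special namespaces instead; both apply softmax over dim 1.
var viaFunctional = nn.functional.softmax(x, 1);
var viaSpecial = special.softmax(x, 1);

Applied to the ultralytics line quoted above, that means wrapping the transposed tensor in nn.functional.softmax(..., 1) rather than calling .softmax(1) on it.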

NiklasGustafsson linked a pull request on Feb 15, 2023 that will close this issue.