
Tracking PyTorch 1.10.0 additions / changes #416

Closed
NiklasGustafsson opened this issue Oct 26, 2021 · 0 comments · Fixed by #806

NiklasGustafsson commented Oct 26, 2021

New functions

torch.segment_reduce (#59951, #60018, #61141, #61266, #59521, #60379, #60379)
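Not part of the original issue, but a minimal sketch of the new `torch.segment_reduce` API for reference (the tensor values here are illustrative, not from the PRs above):

```python
import torch

# Reduce contiguous segments of a 1-D tensor. `lengths` partitions `data`
# into segments: the first 2 elements, then the next 3.
data = torch.tensor([1.0, 5.0, 2.0, 3.0, 4.0])
lengths = torch.tensor([2, 3])

# Per-segment maximum: max([1, 5]) = 5, max([2, 3, 4]) = 4
out = torch.segment_reduce(data, "max", lengths=lengths)
# out: tensor([5., 4.])
```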

Function changes

Added channels-last support for torch.bilinear and torch.nn.MaxUnpool2d (#56322, #49984)

Modules

Added lr_scheduler.SequentialLR (#64037, #65035)
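A minimal sketch of `SequentialLR` chaining two schedulers (the specific warmup/decay schedulers and hyperparameters are illustrative, not from the PRs):

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Halve the LR for the first 2 epochs, then decay exponentially.
warmup = torch.optim.lr_scheduler.ConstantLR(opt, factor=0.5, total_iters=2)
decay = torch.optim.lr_scheduler.ExponentialLR(opt, gamma=0.9)

# SequentialLR runs `warmup` until the milestone, then hands off to `decay`.
sched = torch.optim.lr_scheduler.SequentialLR(
    opt, schedulers=[warmup, decay], milestones=[2])

for _ in range(5):
    opt.step()
    sched.step()
```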

nn.{ReflectionPad3d, LazyInstanceNorm*d} (#59791, #60837, #61308, #60982)
nn.CrossEntropyLoss: Added support for class probability targets (#61044)
nn.CrossEntropyLoss: Added support for label smoothing (#63122)
nn.Module: Added support for arbitrary objects in state_dicts via get_extra_state() / set_extra_state() (#62976)
nn.utils.skip_init(): Added function to skip module parameter / buffer initialization (#57555)
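The two `nn.CrossEntropyLoss` additions above can be sketched as follows (batch size, class count, and the smoothing factor are illustrative):

```python
import torch

logits = torch.randn(3, 5)  # batch of 3, 5 classes

# New: label smoothing, with ordinary class-index targets
smooth_loss_fn = torch.nn.CrossEntropyLoss(label_smoothing=0.1)
hard_targets = torch.tensor([0, 2, 4])
hard_loss = smooth_loss_fn(logits, hard_targets)

# New: class-probability ("soft") targets, same shape as the logits
soft_loss_fn = torch.nn.CrossEntropyLoss()
soft_targets = torch.softmax(torch.randn(3, 5), dim=1)
soft_loss = soft_loss_fn(logits, soft_targets)
```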
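And a small sketch of `nn.utils.skip_init()`, which constructs a module without running its parameter initialization (useful when the weights will be overwritten anyway, e.g. loaded from a checkpoint):

```python
import torch

# Same constructor arguments as torch.nn.Linear(10, 5), but the
# weight/bias initialization kernels are skipped.
m = torch.nn.utils.skip_init(torch.nn.Linear, 10, 5)

# Parameters exist with the right shapes; their values are uninitialized.
print(m.weight.shape)  # torch.Size([5, 10])
```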
