Describe the bug
When I try to use the cuba Recurrent blocks in my network, I get:
RuntimeError: Output 0 of SelectBackward0 is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function.
I tried this with various networks, and I also hit the same error after replacing the Dense layer with the Recurrent layer in the XOR regression and Oxford tutorials.
Here is the network I am using, which is the same as the XOR network but with a recurrent block (sketched below):
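(The original code block was not captured in this report. Below is a minimal sketch of the setup described, assuming the lava-dl SLAYER XOR-style two-block network with the hidden slayer.block.cuba.Dense block swapped for slayer.block.cuba.Recurrent; the layer sizes and neuron parameters are illustrative, not the exact values from the original.)

```python
import torch
import lava.lib.dl.slayer as slayer

class Network(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Illustrative CUBA neuron parameters in the style of the tutorials.
        neuron_params = {
            'threshold': 1.25,
            'current_decay': 0.25,
            'voltage_decay': 0.03,
            'tau_grad': 0.03,
            'scale_grad': 3,
            'requires_grad': True,
        }
        self.blocks = torch.nn.ModuleList([
            # Replacing this Dense block with Recurrent triggers the error.
            slayer.block.cuba.Recurrent(neuron_params, 2, 36),
            slayer.block.cuba.Dense(neuron_params, 36, 1),
        ])

    def forward(self, spike):
        for block in self.blocks:
            spike = block(spike)
        return spike
```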
To reproduce current behavior
Steps to reproduce the behavior: take the XOR regression (or Oxford) tutorial network, replace a slayer.block.cuba.Dense block with slayer.block.cuba.Recurrent, and run training. The forward/backward pass fails with the RuntimeError quoted above.
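For reference, here is a minimal standalone PyTorch snippet, independent of lava-dl, that reproduces the same class of error; the PassThrough Function is hypothetical and only illustrates the view+inplace pattern the message describes, along with the clone fix it suggests:

```python
import torch

class PassThrough(torch.autograd.Function):
    # Hypothetical custom Function that returns its input as-is; autograd
    # then treats the output as a view of the input.
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

h = torch.zeros(4, requires_grad=True) * 1.0  # non-leaf tensor with grad history

y = PassThrough.apply(h)[0]  # SelectBackward0 view of the custom Function's output
# y += 1.0                   # would raise the RuntimeError quoted above

z = PassThrough.apply(h).clone()[0]  # cloning breaks the problematic view chain
z += 1.0                             # in-place on a view of the clone is fine
```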
Expected behavior
The network should train without errors, since the same architecture works well with Dense layers.
Environment (please complete the following information):
Device: MacBook Air (M2)
OS: macOS
Lava version: [e.g. 0.6.1]