[RLlib] Hot fix for `PPOTorchRLModule._compute_values` with non-shared stateful encoder and batch slicing with non-empty infos. #44082
Conversation
…ompute_values' when using a non-shared stateful encoder. In addition, fixed an error that occurs while slicing batches with non-empty infos. Signed-off-by: Simon Zehnder <[email protected]>
@@ -716,7 +716,9 @@ def _batch_slice(self, slice_: slice) -> "SampleBatch":

        # Exclude INFOs from regular array slicing as the data under this column might
        # be a list (not good for `tree.map_structure` call).
        infos = self.get(SampleBatch.INFOS)
        # Furthermore, slicing does not work when the data in the column is
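A hedged sketch of what excluding INFOS achieves (the function and column names below are assumptions for illustration, not RLlib's actual `SampleBatch._batch_slice`): array columns are sliced element-wise, while the infos column is sliced as a plain Python list.

```python
import numpy as np

# Hypothetical sketch, not RLlib's implementation: keep "infos" out of the
# element-wise slicing and slice it as a plain Python list instead.
def slice_batch(batch, slice_):
    infos = batch.get("infos")
    sliced = {k: v[slice_] for k, v in batch.items() if k != "infos"}
    if infos is not None:
        sliced["infos"] = infos[slice_]  # plain list slicing always works
    return sliced

batch = {
    "obs": np.arange(5),
    "infos": [{"t": i} for i in range(5)],  # non-empty infos: list of dicts
}
out = slice_batch(batch, slice(1, 3))
print(out["infos"])  # [{'t': 1}, {'t': 2}]
```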
Makes sense!
You mean a SampleBatch with B=0, correct?
B>0. But in this case the `infos` are a list of dicts. When they are empty, `tree.map_structure(infos)` works, but when they are filled, `tree.map_structure` will fail as it tries to apply the slicing to singular values.
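To illustrate the failure mode described above, here is a minimal sketch that mimics dm-tree's `tree.map_structure` recursion (the mimic itself is an assumption for illustration): with empty info dicts there are no leaves to slice, but with filled dicts the leaves are scalars and slicing them raises `TypeError`.

```python
# Minimal mimic of dm-tree's `tree.map_structure` recursion (assumption),
# to show why slicing fails on non-empty infos.
def map_structure(fn, struct):
    if isinstance(struct, dict):
        return {k: map_structure(fn, v) for k, v in struct.items()}
    if isinstance(struct, (list, tuple)):
        return type(struct)(map_structure(fn, v) for v in struct)
    return fn(struct)  # apply fn only at the leaves

# Empty infos: no leaves to slice, so nothing fails.
assert map_structure(lambda v: v[1:3], [{}, {}]) == [{}, {}]

# Non-empty infos: leaves are scalars, and `0[1:3]` raises TypeError.
try:
    map_structure(lambda v: v[1:3], [{"env_step": 0}, {"env_step": 1}])
    failed = False
except TypeError:
    failed = True
print(failed)  # True
```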
Looks good, thanks for the fix @simonsays1980!
@@ -90,6 +90,10 @@ def _compute_values(self, batch, device=None):

        # Separate vf-encoder.
        if hasattr(self.encoder, "critic_encoder"):
            if self.is_stateful():
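A toy sketch of the idea behind this branch (all names below are assumptions for illustration, not RLlib's actual module code): with a separate, stateful critic encoder, that encoder's spec expects only its own sub-state, so only the "critic" entry of `state_in` is passed through.

```python
# Toy sketch (names are assumptions, not RLlib's actual code): a separate,
# stateful critic encoder expects only its own sub-state, not the full
# actor/critic state dict.
def compute_values(state_in, has_critic_encoder, is_stateful):
    if has_critic_encoder and is_stateful:
        state_in = state_in["critic"]  # extract only the critic's state
    return state_in

full_state = {"actor": {"h": [0.0]}, "critic": {"h": [1.0]}}
critic_state = compute_values(full_state, has_critic_encoder=True, is_stateful=True)
print(critic_state)  # {'h': [1.0]}
```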
Awesome! Ran into this issue yesterday as well (and continued testing then with a shared value function :) ).
Why are these changes needed?
Running PPO with `use_lstm=True` and `vf_share_layers=False` results in an error in the `PPOTorchRLModule._compute_values` method, as the specs checker expects a different spec for the `state_in`. Extracting the `state_in` for the `critic` solves this problem.

Another problem solved relates to non-empty infos in batch slicing (mainly occurring in `MinibatchIterator`s). The reason is that slicing via `tree.map_structure` also tries to slice the entries of the `infos`, which are usually singular values.

Related issue number
Checks
- I've signed off every commit (by using the -s flag, i.e., `git commit -s`) in this PR.
- I've run `scripts/format.sh` to lint the changes in this PR.
- If I've added a method in Tune, I've added it in `doc/source/tune/api/` under the corresponding `.rst` file.