Conversation
p2p/multiplexer.py (outdated)
```python
current_batch += 1
if current_batch >= max_batch_size:
    await asyncio.sleep(0)
    current_batch = 0
```
Why did you choose to release the event loop every 5 items instead of on every single item? To me it'd make more sense to have iterators/generators produce as many items as they can without blocking, and then let consumers decide how to batch. However, @pipermerriam found that trio.ReceiveChannel releases on every iteration (ethereum/lahja#179 (comment)), so maybe we should be consistent with that? Either way, I'm not convinced this is the right place to implement the batching.
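For context, a minimal sketch of the consumer-side batching idea: the producer yields items one at a time, and a small helper groups them. The `batch` name and signature are illustrative, not part of the multiplexer API.

```python
from typing import AsyncIterator, List, TypeVar

T = TypeVar("T")


async def batch(stream: AsyncIterator[T], max_batch_size: int) -> AsyncIterator[List[T]]:
    # Group items from an async iterator into lists of up to max_batch_size.
    # The producer stays simple and yields every item; batching is a consumer
    # concern layered on top.
    items: List[T] = []
    async for item in stream:
        items.append(item)
        if len(items) >= max_batch_size:
            yield items
            items = []
    if items:
        yield items
```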
I was trying to honor the original intent there, which seemed to be that releasing it on every item would be a bad idea. I don't have a better reason than that, and it would certainly be simpler to just release every time. So I'll update it to release each cycle, unless someone thinks it's important to be able to pump out batches of these.
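Roughly what "release each cycle" looks like, as a sketch only; the `drain` name and the queue/handler shape are assumptions, not the actual multiplexer code.

```python
import asyncio
from typing import Callable, TypeVar

T = TypeVar("T")


async def drain(queue: "asyncio.Queue[T]", handle: Callable[[T], None]) -> None:
    # Hypothetical stand-in for the multiplexer loop: yield to the event loop
    # after every item instead of only once per max_batch_size items.
    while True:
        item = await queue.get()
        handle(item)
        await asyncio.sleep(0)  # release the event loop each cycle
```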
Force-pushed from e293459 to 78ba19c
Anything else for a 👍 @gsalgado?
Took >0.8s at least once.
The set of _active_prefixes could change while trying to mark them as completed, so make a copy before passing it in.
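A small self-contained illustration of the copy-before-passing pattern in that commit; `mark_complete` and the prefix values are hypothetical stand-ins for the real `trie_fog` call.

```python
active_prefixes = {(0,), (1,), (2,)}  # stand-in for self._active_prefixes


def mark_complete(prefixes):
    # Stand-in for trie_fog marking prefixes complete; it iterates its input.
    return list(prefixes)


# Passing the set directly risks "RuntimeError: Set changed size during
# iteration" if another task mutates it between awaits. Passing a snapshot
# (a copy taken up front) stays safe even if the set changes afterwards.
snapshot = tuple(active_prefixes)
marked = mark_complete(snapshot)
active_prefixes.discard((1,))  # later mutation no longer affects the snapshot
```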
Force-pushed from 011d034 to dec2b48
Revert this as soon as it's fixed in py-trie, then py-evm.
Force-pushed from 820cfc4 to 22b77cb
What was wrong?
Related to #1980
How was it fixed?
Bonuses:
- A `set` was modified while `trie_fog` was using it to mark some prefixes as complete

To-Do
Cute Animal Picture