
Do not fetch recursive pins from pinner unnecessarily #7883

Merged
merged 2 commits into master from chore/pin-ls-all-faster
Mar 29, 2021

Conversation

gammazero (Contributor):

When fetching all pins, the recursive pins are fetched from the pinner twice. The second fetch is unnecessary and copies all recursive pins into a slice again.

Additionally, the output channel is now buffered. This allows the goroutine to exit when the pinner returns an error and there is no reader for the output channel. This can happen if a canceled context causes the caller to abandon waiting to read the output of Ls().

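To make the buffered-channel rationale concrete, here is a minimal, self-contained sketch of the pattern (hypothetical pin type and fetchPins helper; not the actual go-ipfs coreapi code):

```go
// Sketch of the buffered-output pattern, with hypothetical types.
package main

import (
	"context"
	"errors"
	"fmt"
)

type pin struct {
	path string
	err  error
}

// fetchPins stands in for the pinner; here it always fails.
func fetchPins(ctx context.Context) ([]string, error) {
	return nil, errors.New("pinner failed")
}

// lsAll streams pins on the returned channel. The one-slot buffer lets
// the goroutine deposit a final error and exit even if the caller has
// already stopped reading.
func lsAll(ctx context.Context) <-chan pin {
	out := make(chan pin, 1)
	go func() {
		defer close(out)
		paths, err := fetchPins(ctx)
		if err != nil {
			// With an unbuffered channel this send would block forever
			// once the reader is gone; the buffer lets it complete.
			out <- pin{err: err}
			return
		}
		for _, p := range paths {
			select {
			case out <- pin{path: p}:
			case <-ctx.Done():
				return
			}
		}
	}()
	return out
}

func main() {
	for p := range lsAll(context.Background()) {
		if p.err != nil {
			fmt.Println("error:", p.err)
			return
		}
		fmt.Println(p.path)
	}
}
```

Note that the single buffer slot only guarantees room for the terminal error; it does not help when many results are still pending, which is the point the review below turns on.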
@gammazero gammazero added the P4 Very low priority label Jan 27, 2021
@aschmahmann aschmahmann added P4 Very low priority and removed P4 Very low priority labels Jan 27, 2021
@Stebalien Stebalien self-requested a review March 22, 2021 15:24
@@ -220,7 +220,7 @@ func (p *pinInfo) Err() error {

 // pinLsAll is an internal function for returning a list of pins
 func (api *PinAPI) pinLsAll(ctx context.Context, typeStr string) <-chan coreiface.Pin {
-	out := make(chan coreiface.Pin)
+	out := make(chan coreiface.Pin, 1)
Stebalien (Member):

why do we need this?

Stebalien (Member):

> Additionally, the output channel is now buffered. This allows the goroutine to exit when the pinner returns an error and there is no reader for the output channel. This can happen if a canceled context causes the caller to abandon waiting to read the output of Ls().

Ah, I see. Unfortunately, that doesn't really fix the issue, as we're not guaranteed to see that the context has been canceled immediately.

See ipfs/interface-go-ipfs-core#62
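To illustrate why cancellation is not observed promptly (a standalone demonstration, not code from this PR): Go's select chooses uniformly at random among ready cases, so a send can keep winning against ctx.Done() even after the context is canceled:

```go
// Demonstration: select does not prioritize ctx.Done() over a ready send.
package main

import (
	"context"
	"fmt"
)

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	cancel() // canceled before any send is attempted

	out := make(chan int, 1)
	sent := 0
	for i := 0; i < 1000; i++ {
		select {
		case out <- i:
			// Both cases were ready; select picked the send at random.
			sent++
			<-out // empty the buffer so the next send is ready too
		case <-ctx.Done():
			// The cancellation case; nothing to do here.
		}
	}
	// Roughly half the iterations send despite the canceled context.
	fmt.Printf("%d of 1000 sends won the race against a canceled context\n", sent)
}
```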

gammazero (Contributor, Mar 22, 2021):

Yes, you are correct that if the caller abandons waiting, we may still try to deliver other results before delivering the error, in which case the goroutine is back to blocking on writing to the channel. The idea is that this may help if the caller gives up waiting (the context times out or is canceled) after waiting too long for results.

Since this does not actually prevent leaking the goroutine, I can remove the buffering and instead add a comment stating that the caller must keep reading results until the channel is closed, to prevent leaking the goroutine. Or I can provide an implementation similar to ipfs/interface-go-ipfs-core#62.
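A sketch of that caller-side contract, reusing the hypothetical lsAll from the earlier sketch (illustration only):

```go
// Caller-side contract: keep reading until the channel is closed so
// the producer goroutine can exit instead of leaking.
func consume(ctx context.Context) {
	pins := lsAll(ctx)
	for p := range pins {
		if p.err != nil {
			fmt.Println("ls failed:", p.err)
			break
		}
		fmt.Println(p.path)
	}
	// Having broken out early, drain until close; otherwise the
	// producer may block forever on its next send.
	for range pins {
	}
}
```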

Stebalien (Member):

I'm fine keeping the buffering just to be nice. I'd also add a comment.

Let's punt on the interface changes for now.

gammazero (Contributor):

Done
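For reference, a hedged sketch of what the merged function header plausibly looks like with that comment added (wording assumed, not quoted from commit 99309df):

```go
// pinLsAll is an internal function for returning a list of pins
//
// The caller must keep reading results until the channel is closed to
// prevent leaking the goroutine that is fetching pins.
func (api *PinAPI) pinLsAll(ctx context.Context, typeStr string) <-chan coreiface.Pin {
	out := make(chan coreiface.Pin, 1)
	// ... goroutine that fetches pins and sends them on out ...
	return out
}
```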

@Stebalien Stebalien merged commit 99309df into master Mar 29, 2021
@Stebalien Stebalien deleted the chore/pin-ls-all-faster branch March 29, 2021 23:04
@aschmahmann aschmahmann mentioned this pull request May 14, 2021
Labels: P4 Very low priority
3 participants