fix: fix a memory leak where a dataset would not be fully deleted #1606

Merged

Conversation

westonpace
Contributor

If you create/open a dataset, search it (using a vector index), and then drop it, some memory is leaked. This is because the dataset has an index cache, and the index cache keeps strong pointers to the indices. However, in the IVF case, we were also keeping a strong pointer back to the index cache. This created a cycle, so the index and the cache kept each other alive.
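
For illustration, here is a minimal sketch of that kind of cycle. The type and field names (`IndexCache`, `IvfIndex`, `entries`) are hypothetical, not the actual Lance code; the point is only that a cache holding `Arc`s to indices, where each index holds an `Arc` back to the cache, can never be dropped:

```rust
use std::sync::{Arc, Mutex};

// Hypothetical names; a sketch of the cycle, not the real Lance types.
struct IndexCache {
    // The cache keeps strong pointers to the indices it has loaded.
    entries: Mutex<Vec<Arc<IvfIndex>>>,
}

struct IvfIndex {
    // The IVF index also keeps a strong pointer back to the cache,
    // so cache -> index -> cache forms a reference cycle.
    cache: Arc<IndexCache>,
}

fn open_search_drop() {
    let cache = Arc::new(IndexCache {
        entries: Mutex::new(Vec::new()),
    });
    let index = Arc::new(IvfIndex {
        cache: Arc::clone(&cache),
    });
    cache.entries.lock().unwrap().push(index);
    // Dropping `cache` here frees nothing: the cache entry keeps the index
    // alive and the index keeps the cache alive, so both leak.
}
```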

This fix downgrades the IVF index's pointer from `Arc` to `Weak` to break the cycle. I've verified the fix manually but am not sure of a good way to detect this sort of thing in testing. Perhaps we can investigate running valgrind?
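
As a sketch of the same idea (again with hypothetical names), the back-pointer becomes a `Weak` and is upgraded only when the index actually needs the cache; once the dataset and its cache are dropped, the upgrade returns `None` and nothing keeps the cache alive:

```rust
use std::sync::{Arc, Mutex, Weak};

struct IndexCache {
    entries: Mutex<Vec<Arc<IvfIndex>>>,
}

struct IvfIndex {
    // Weak back-pointer: it no longer keeps the cache alive, breaking the cycle.
    cache: Weak<IndexCache>,
}

impl IvfIndex {
    fn with_cache<R>(&self, f: impl FnOnce(&IndexCache) -> R) -> Option<R> {
        // Upgrade to a strong pointer only for the duration of the access;
        // returns None if the cache has already been dropped.
        self.cache.upgrade().map(|cache| f(cache.as_ref()))
    }
}

fn register(cache: &Arc<IndexCache>) -> Arc<IvfIndex> {
    // The cache hands the index only a Weak reference to itself.
    let index = Arc::new(IvfIndex {
        cache: Arc::downgrade(cache),
    });
    cache.entries.lock().unwrap().push(Arc::clone(&index));
    index
}
```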

@westonpace westonpace merged commit 5ca577a into lancedb:main Nov 15, 2023
15 of 17 checks passed