fix: fix a memory leak where a dataset would not be fully deleted #1606
If you create/open a dataset, search it (using a vector index), and then drop it, some memory leaks. This is because the dataset has an index cache that keeps strong pointers to the indices. However, in the IVF case, the index was also keeping a strong pointer back to the index cache. This created a reference cycle, and the index and cache kept each other alive.
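For illustration, here is a minimal Rust sketch of the cycle. The `IndexCache` and `IvfIndex` types below are simplified stand-ins for illustration, not the actual Lance definitions:

```rust
use std::sync::{Arc, Mutex};

// Simplified stand-ins for the real types.
struct IndexCache {
    // The cache keeps strong pointers to the indices it holds.
    indices: Mutex<Vec<Arc<IvfIndex>>>,
}

struct IvfIndex {
    // The bug: the IVF index also held a strong pointer back to the cache.
    cache: Arc<IndexCache>,
}

fn main() {
    let cache = Arc::new(IndexCache {
        indices: Mutex::new(Vec::new()),
    });
    let index = Arc::new(IvfIndex {
        cache: Arc::clone(&cache),
    });
    cache.indices.lock().unwrap().push(Arc::clone(&index));

    // Dropping the local handles leaves each Arc with a strong count
    // of 1 (held by the other), so neither allocation is ever freed.
    drop(index);
    drop(cache);
}
```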
This fix downgrades the IVF index's pointer from `Arc` to `Weak` to break the cycle. I've verified the fix manually but am not sure of a good way to detect this sort of thing in testing. Perhaps we can investigate running valgrind?
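Here is a rough sketch of how downgrading to `Weak` breaks the cycle, plus one possible testing pattern that avoids valgrind: hold a `Weak` probe and assert it can no longer be upgraded once everything is dropped. Again, these are simplified stand-ins rather than the real Lance types:

```rust
use std::sync::{Arc, Mutex, Weak};

struct IndexCache {
    indices: Mutex<Vec<Arc<IvfIndex>>>,
}

struct IvfIndex {
    // The fix: a weak pointer back to the cache, so the cache can be
    // freed even while an index it holds is still reachable.
    cache: Weak<IndexCache>,
}

fn main() {
    let cache = Arc::new(IndexCache {
        indices: Mutex::new(Vec::new()),
    });
    let index = Arc::new(IvfIndex {
        cache: Arc::downgrade(&cache),
    });
    cache.indices.lock().unwrap().push(index);

    // A possible testing pattern: keep a Weak probe, drop every strong
    // handle, then assert the allocation is actually gone.
    let probe: Weak<IndexCache> = Arc::downgrade(&cache);
    drop(cache);
    assert!(probe.upgrade().is_none(), "cache leaked");
}
```

A `Weak::upgrade` check like this only catches leaks of objects the test explicitly probes, so valgrind (or a heap profiler) may still be worth investigating for the general case.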