
Fix off by one error in inference.py #131

Merged
merged 1 commit into from
Aug 31, 2023
Conversation

gabrieltseng
Contributor

@gabrieltseng gabrieltseng commented Jun 6, 2023

Currently, inference misses the final prediction (and then fails during dataframe creation) whenever x_np.shape[0] % batch_size == 1.

This PR fixes that. Specifically:

>>> import numpy as np
>>> total_size, batch_size = 1185857, 64
>>> x_np = np.ones(total_size)
>>> # current approach
>>> batches = [x_np[i : i + batch_size] for i in range(0, x_np.shape[0] - 1, batch_size)]
>>> sum([len(x) for x in batches])
1185856
>>> # fixed
>>> batches = [x_np[i : i + batch_size] for i in range(0, x_np.shape[0], batch_size)]
>>> sum([len(x) for x in batches])
1185857
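The same boundary condition can be checked with a small sketch (the sizes below are illustrative, not from the PR; any total_size with total_size % batch_size == 1 triggers the bug):

```python
import numpy as np

# 9 % 4 == 1, so the old loop's range stops one step short
total_size, batch_size = 9, 4
x_np = np.ones(total_size)

# old approach: range(0, n - 1, bs) skips the start index of the final
# single-element batch when n % bs == 1
old = [x_np[i : i + batch_size] for i in range(0, x_np.shape[0] - 1, batch_size)]

# fixed approach: range(0, n, bs) always covers every element
new = [x_np[i : i + batch_size] for i in range(0, x_np.shape[0], batch_size)]

print(sum(len(b) for b in old))  # 8 -- the lone trailing element is dropped
print(sum(len(b) for b in new))  # 9 -- all elements covered
```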

@ivanzvonkov
Contributor

Nice catch, thank you!

@ivanzvonkov ivanzvonkov merged commit 144723e into main Aug 31, 2023
@ivanzvonkov ivanzvonkov deleted the inference-fix branch August 31, 2023 17:01