Optimize `recommend()` with bulk prediction #64

Conversation

Optimize `recommend()` with bulk prediction by pre-computing an approximated user-item matrix.
Codecov Report — Base: 80.30% // Head: 80.37% // Increases project coverage by +0.06%.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master      #64      +/-   ##
==========================================
+ Coverage   80.30%   80.37%   +0.06%
==========================================
  Files          26       26
  Lines         853      856       +3
==========================================
+ Hits          685      688       +3
  Misses        168      168
```
Since the entire approximated matrix is stored in a recommender, bulk prediction can be made efficient by extracting the values at the given indices.
Avoid multithreading for simplicity; bulk prediction is now called from a thread that calls `recommend`, and spawning many nested threads would cause complications.
Otherwise, an aggregated metric may receive an incomplete top-k recommendation list containing `undef` values.
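The idea above can be sketched as follows. This is a minimal Python illustration, not this project's actual API: the class `MFRecommender` and the method names are hypothetical, and it assumes a matrix-factorization model with dense user and item factor matrices.

```python
import numpy as np

# Hypothetical matrix-factorization recommender; all names are illustrative.
class MFRecommender:
    def __init__(self, user_factors, item_factors):
        # Pre-compute (cache) the approximated user-item matrix once.
        self.scores = user_factors @ item_factors.T  # shape: (n_users, n_items)

    def recommend(self, user, k):
        # Single-user recommendation: top-k items by cached score.
        row = self.scores[user]
        return list(np.argsort(row)[::-1][:k])

    def recommend_bulk(self, users, k):
        # Bulk prediction: extract the cached rows at the given indices
        # in one shot, instead of re-scoring each user separately.
        rows = self.scores[users]  # shape: (len(users), n_items)
        return np.argsort(rows, axis=1)[:, ::-1][:, :k].tolist()
```

Under this assumption, the bulk path is a single slice plus a vectorized sort, so there is no need to spawn one worker per user.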
In particular, this accelerates the calculation of matrix factorization-based recommendation scores by pre-computing (caching) an approximated user-item matrix.
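As a sketch of the caching idea (hypothetical names, and Python used purely for illustration): once a factorization R ≈ U Vᵀ is available, pre-computing the full product turns every subsequent score request into a constant-time lookup instead of a per-call dot product.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.standard_normal((100, 8))  # user factor matrix (illustrative sizes)
V = rng.standard_normal((50, 8))   # item factor matrix

# Without caching: one dot product per requested (user, item) pair.
def score_on_the_fly(u, i):
    return U[u] @ V[i]

# With caching: pre-compute the approximated user-item matrix once...
approx = U @ V.T  # shape: (n_users, n_items)

# ...so every later score is just an index lookup.
def score_cached(u, i):
    return approx[u, i]
```

The trade-off is memory: the cached matrix is dense with one entry per user-item pair, which is what makes the bulk extraction described above cheap.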