(LDA): Search complexity in SparseVector (word-topic and doc-topic vectors) is log(K); please consider using HashVector in Breeze (or OpenAddressHashArray in Breeze), which is O(1)
#28
Open
hucheng opened this issue on Aug 26, 2015 · 2 comments
During sampleToken, when computing the probability Nkd * (Nkw + beta): given a topic k in Nkd, finding the corresponding Nkw for that topic in a SparseVector is O(log Kw). If we use a HashVector instead, the complexity drops to O(1), at the cost of a small space overhead.
Since HashVector in Breeze is not serializable, please consider using OpenAddressHashArray in Breeze directly.
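For illustration, here is a minimal sketch of the lookup-cost difference in Breeze (the topic count and variable names are made up for this example, not taken from the project):

```scala
import breeze.linalg.{HashVector, SparseVector}

val numTopics = 1024 // illustrative value only

// SparseVector keeps its non-zeros in sorted index/value arrays, so a
// random read nkwSparse(k) is a binary search: O(log Kw) per lookup.
val nkwSparse = SparseVector.zeros[Int](numTopics)
nkwSparse(3) = 7
val s = nkwSparse(3) // binary search over the active indices

// HashVector is backed by an OpenAddressHashArray, so the same read is an
// expected O(1) probe, at the cost of extra space for empty slots.
val nkwHash = HashVector.zeros[Int](numTopics)
nkwHash(3) = 7
val h = nkwHash(3) // open-addressing hash probe
```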
Note that aggregating (summing) two HashVectors costs more than summing two SparseVectors. So the key is to avoid allocating a new vector and instead add in place; see aggregateByKey, and the sketch below.
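Here is a hedged sketch of that in-place pattern (the input RDD shape and all names are assumptions for illustration; also note that, per the issue above, HashVector itself may not serialize across a shuffle, so a real implementation would wrap OpenAddressHashArray in a serializable type):

```scala
import breeze.linalg.HashVector
import org.apache.spark.rdd.RDD

// Sketch only: aggregate per-term topic counts with in-place updates so no
// new vector is allocated per element. The input shape, termId -> (topic,
// delta), is an assumption for this example, not the project's actual API.
def aggregateTermTopicCounts(deltas: RDD[(Int, (Int, Int))],
                             numTopics: Int): RDD[(Int, HashVector[Int])] =
  deltas.aggregateByKey(HashVector.zeros[Int](numTopics))(
    // seqOp: mutate the partition-local accumulator in place
    (acc, kv) => { acc(kv._1) += kv._2; acc },
    // combOp: fold acc2 into acc1 entry by entry instead of `acc1 + acc2`,
    // which would allocate a fresh result vector
    (acc1, acc2) => {
      acc2.activeIterator.foreach { case (k, v) => acc1(k) += v }
      acc1
    }
  )
```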
I tried several open-addressing hash map implementations as the HashVector backend, but surprisingly they are all slower than SparseVector (Java's HashMap >> Spark's OpenHashMap > Breeze's OpenAddressHashArray > fastutil's int-int map), perhaps because collisions occur around log(K) times on average.
But this issue is now neatly solved with a very simple trick: before we sample all the edges of one term, we first convert the term's count vector into a DenseVector. Finding n_kw then takes only O(1); see the sketch below.
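A minimal sketch of that trick (method and variable names are illustrative, not the project's actual code):

```scala
import breeze.linalg.{DenseVector, SparseVector}

// Sketch: densify a term's topic-count vector once, then sample its edges.
def sampleTermEdges(termCounts: SparseVector[Int], edgeTopics: Array[Int]): Unit = {
  // One O(K) conversion per term, amortized over all of its edges...
  val nkw: DenseVector[Int] = termCounts.toDenseVector
  var i = 0
  while (i < edgeTopics.length) {
    val k = edgeTopics(i)
    // ...so each n_kw read in the hot sampling loop is a direct array
    // access, O(1), instead of a binary search.
    val count = nkw(k)
    // compute Nkd * (count + beta) here and resample this edge's topic
    i += 1
  }
}
```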