implement ref added
tower120 committed Jan 21, 2024
1 parent 3c0213e commit da34fa6
Showing 10 changed files with 197 additions and 166 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -99,4 +99,4 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- run: RUSTFLAGS="--deny warnings" cargo doc --lib
- run: RUSTFLAGS="--deny warnings" cargo doc --lib --all-features
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -1,6 +1,9 @@
# Changelog

## 0.5.0
### Fix
- The `NoCache` reduce version was unsound for iterators that had to be dropped.

### Optimization
- On each level, an intrusive singly linked list is now used instead of a Vec of empty block indices (see the sketch below).
This completely eliminates that kind of memory overhead. Previously, if you filled a `_256bit` bitset,
@@ -11,6 +14,8 @@
### Changed
- `BitSetInterface` now has a default implementation.
- `BitSetInterface` no longer has an `IntoIterator` base.
- `BitSet` no longer implements `BitSetInterface`,
but `&BitSet` still does. This prevents accidentally sending a bitset by value (see the usage sketch below).

### Added
- `BitBlock::first_u64()`.
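The optimization entry above replaces a Vec of empty-block indices with an intrusive singly linked list threaded through the empty blocks themselves. Below is a minimal standalone sketch of that idea; all names and types here are hypothetical and not the crate's internals.

```rust
// Hypothetical block pool: instead of a separate Vec<usize> of free block indices,
// each empty block stores the index of the next empty block in its own unused storage.
struct BlockPool {
    blocks: Vec<u64>,   // stand-in for the real block type (one mask word per block)
    first_free: usize,  // head of the intrusive free list; 0 means "no free block"
}

impl BlockPool {
    /// Reuse an empty block if one is available, otherwise grow the pool.
    fn alloc_block(&mut self) -> usize {
        if self.first_free != 0 {
            let index = self.first_free;
            // The "next free" link lives inside the empty block itself.
            self.first_free = self.blocks[index] as usize;
            self.blocks[index] = 0;
            index
        } else {
            self.blocks.push(0);
            self.blocks.len() - 1
        }
    }

    /// Return a now-empty block to the free list - no extra allocation needed.
    fn free_block(&mut self, index: usize) {
        self.blocks[index] = self.first_free as u64;
        self.first_free = index;
    }
}

fn main() {
    // Slot 0 is reserved, so index 0 can double as the "no free block" marker.
    let mut pool = BlockPool { blocks: vec![0], first_free: 0 };
    let a = pool.alloc_block();
    pool.free_block(a);
    let b = pool.alloc_block();
    assert_eq!(a, b); // the freed block was reused without any side Vec of indices
}
```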
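The "Changed" entries above mean that lazy set operations now receive bitsets by reference. A rough usage sketch follows, relying on the crate-root `apply` and the `BitSet`/`config::_256bit` items referenced in this commit; `ops::And`, `Default`, `insert`, `contains` and `iter` are assumed API details not confirmed by the diff itself.

```rust
use hi_sparse_bitset::{apply, ops, BitSet};

type Set = BitSet<hi_sparse_bitset::config::_256bit>;

fn main() {
    // `Default`, `insert`, `contains` and `iter` are assumed here for illustration.
    let mut a = Set::default();
    let mut b = Set::default();
    a.insert(1);
    a.insert(2);
    b.insert(2);

    // `BitSet` itself no longer implements `BitSetInterface`, but `&BitSet` does,
    // so the lazy intersection borrows the sets instead of consuming them.
    let intersection = apply(ops::And, &a, &b);
    for index in intersection.iter() {
        assert_eq!(index, 2);
    }

    // Both sets are still usable afterwards, since they were only borrowed.
    assert!(a.contains(1) && b.contains(2));
}
```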
1 change: 0 additions & 1 deletion Cargo.toml
@@ -22,7 +22,6 @@ simd = ["dep:wide"]

[dependencies]
wide = { version = "0.7.13", optional = true }
arrayvec = "0.7.4"

[dev-dependencies]
rand = "0.8"
6 changes: 3 additions & 3 deletions examples/custom_bitset_simple.rs
@@ -5,7 +5,7 @@
use std::marker::PhantomData;
use std::mem::{ManuallyDrop, MaybeUninit};
use hi_sparse_bitset::config::Config;
use hi_sparse_bitset::{BitBlock, BitSetBase, impl_bitset, impl_simple_bitset};
use hi_sparse_bitset::{BitBlock, BitSetBase, impl_bitset_simple};
use hi_sparse_bitset::implement::*;

#[derive(Default)]
@@ -34,8 +34,8 @@ impl<Conf: Config> LevelMasks for Empty<Conf> {
}
}

impl_simple_bitset!(
impl<Conf> for Empty<Conf> where Conf: Config
impl_bitset_simple!(
impl<Conf> for ref Empty<Conf> where Conf: Config
);

fn main(){
33 changes: 24 additions & 9 deletions src/bitset_interface.rs
@@ -43,26 +43,30 @@ pub trait LevelMasks: BitSetBase{
/// I don't know if it will actually be used, so no extra work has been done on top of it.
/// If you do use it and want it improved - open an issue.
///
/// # How it used
/// # How it is used
///
/// See the [CachingBlockIter::next()] code to see how it is used.
///
/// ```ignore
/// let mut state = bitset.make_iter_state();
/// let mut level1_block_data = MaybeUninit::uninit(); // POD
/// let mut level1_block_data = MaybeUninit::new(Default::default());
///
/// fn next() {
/// ...
/// level1_block_data.assume_init_drop();
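/// // The previous contents were just dropped; `update_level1_block_data` rewrites
/// // `level1_block_data` in place and returns the level1 mask plus an `is_not_empty` hint.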
/// let (level1_mask, is_not_empty) = bitset.update_level1_block_data(state, level1_block_data, level0_index);
/// ...
/// let bitblock = data_mask_from_block_data(level1_block_data, level1_index);
///
/// return bitblock;
/// }
///
/// level1_block_data.assume_init_drop();
/// bitset.drop_iter_state(state);
/// ```
///
/// [Reduce]: crate::Reduce
/// [Apply]: crate::Apply
/// [CachingBlockIter::next()]: crate::iter::CachingBlockIter::next()
pub trait LevelMasksIterExt: LevelMasks{
/// Consists of child states (if any) + Self state.
@@ -103,15 +107,16 @@ pub trait LevelMasksIterExt: LevelMasks{
/// `level1_block_data` will come in an undefined state - rewrite it completely.
///
/// `is_not_empty` is not used by the iterator itself, but can be used by other
/// generative bitsets (namely [Reduce]) - we expect compiler to optimize away that non-used code.
/// generative bitsets (namely [Reduce]) - we expect the compiler to optimize away the unused code.
/// It exists because sometimes you may have a faster way of checking emptiness
/// than checking a SIMD register (bitblock) for zero in the general case.
/// For example, in BitSet - this is checking of block indirection index for zero.
/// For example, in BitSet it is done by checking the block indirection index for zero (a standalone sketch follows this file's diff).
///
/// # Safety
///
/// indices are not checked.
///
/// [Reduce]: crate::Reduce
// Performance-wise it is important to use this in-place construct style,
// instead of just returning Level1BlockData. Even if we return Level1BlockData,
// and then immediately write it to MaybeUninit - the compiler somehow still can't
@@ -126,10 +131,8 @@
/// # Safety
///
/// indices are not checked.
///
/// P.S. It can actually accept &self as well - but that was never needed.
unsafe fn data_mask_from_block_data(
/*&self,*/ level1_block_data: &Self::Level1BlockData, level1_index: usize
level1_block_data: &Self::Level1BlockData, level1_index: usize
) -> <Self::Conf as Config>::DataBitBlock;
}

@@ -209,11 +212,23 @@ impl<'a, T: LevelMasksIterExt> LevelMasksIterExt for &'a T {
/// [^traverse_def]: By "traverse" we mean applying a function to
/// each element of the bitset.
///
/// # Implementation
///
/// Consider using [impl_bitset!] instead of implementing it manually.
///
/// Implementing BitSetInterface for T will make it passable by value to [apply], [reduce].
/// That may not be what you want if your type contains heavy data, or if your
/// [LevelMasksIterExt] implementation depends on *Self being stable during iteration.
/// If that is the case - implement only for &T (see the sketch after this file's diff).
///
/// [CachingBlockIter]: crate::iter::CachingBlockIter
/// [CachingIndexIter]: crate::iter::CachingIndexIter
pub trait BitSetInterface
/// [LevelMasksIterExt]: crate::implement::LevelMasksIterExt
/// [impl_bitset!]: crate::impl_bitset!
/// [apply]: crate::apply()
/// [reduce]: crate::reduce()
pub unsafe trait BitSetInterface
: BitSetBase
//+ IntoIterator<IntoIter = DefaultIndexIterator<Self>>
+ LevelMasksIterExt
+ Sized
{
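The `is_not_empty` discussion in the `update_level1_block_data` docs above notes that a concrete bitset can report emptiness from its block indirection index rather than by testing the bit block for zero. Here is a standalone illustration of that idea; it is not the crate's real trait method, and every name in it is hypothetical.

```rust
// Hypothetical level-1 lookup: `block_indices` maps a level0 index to a block slot,
// with slot 0 reserved for the shared all-zero block.
fn level1_mask(block_indices: &[u16], blocks: &[u64], level0_index: usize) -> (u64, bool) {
    let block_index = block_indices[level0_index] as usize;
    let mask = blocks[block_index];
    // Cheap emptiness hint: compare the indirection index with zero
    // instead of testing the (possibly SIMD-wide) mask itself.
    (mask, block_index != 0)
}

fn main() {
    let blocks = [0u64, 0b1010];   // slot 0 = shared empty block
    let block_indices = [0u16, 1]; // level0 index 0 -> empty, level0 index 1 -> slot 1
    assert_eq!(level1_mask(&block_indices, &blocks, 0), (0, false));
    assert_eq!(level1_mask(&block_indices, &blocks, 1), (0b1010, true));
}
```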
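The new `BitSetInterface` docs above recommend implementing the interface only for `&T` when the type holds heavy data or its iteration state depends on `*Self` staying put - which is what the `for ref Empty<Conf>` form in examples/custom_bitset_simple.rs does. A generic sketch of that pattern with a made-up trait (not the crate's `BitSetInterface`):

```rust
// Made-up trait standing in for an interface whose methods take `self` by value.
trait Queryable {
    fn count_ones(self) -> usize;
}

struct HeavySet {
    words: Vec<u64>,
}

// Implementing only for `&HeavySet` means generic code taking `impl Queryable`
// receives a cheap reference; the heavy value cannot be moved in by accident.
impl Queryable for &HeavySet {
    fn count_ones(self) -> usize {
        self.words.iter().map(|w| w.count_ones() as usize).sum()
    }
}

fn total(set: impl Queryable) -> usize {
    set.count_ones()
}

fn main() {
    let set = HeavySet { words: vec![0b1011, 0b1] };
    assert_eq!(total(&set), 4); // `&HeavySet` implements the trait
    // total(set);              // would not compile: `HeavySet` itself does not
    assert_eq!(set.words.len(), 2);
}
```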