
Update approval-voting-regression-bench #5504

Merged: 3 commits merged into master from alexggh/fix_bench_network on Aug 28, 2024

Conversation

@alexggh (Contributor) commented Aug 27, 2024

The accepted divergence rate of 1/1000 is excessive and leads to false positives, especially after #4772 and #5042, so let's increase it to 1/100: we do have some randomness in the system and there is no point in being that strict.

Fixes: #5463
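For context, here is a minimal sketch of what raising the accepted divergence from 0.001 to 0.01 changes in practice. This is not the actual subsystem-bench code; the function name, the use of the baseline value, and the 0.2% example run are illustrative only.

```rust
/// Illustrative only: returns true when `measured` stays within `divergence`
/// (a relative tolerance) of the expected baseline value.
fn within_divergence(measured: f64, expected: f64, divergence: f64) -> bool {
    let relative_deviation = (measured - expected).abs() / expected;
    relative_deviation <= divergence
}

fn main() {
    // Hypothetical baseline, taken from the average quoted later in this thread.
    let expected = 63999.3651;

    // A run that lands 0.2% away from the baseline.
    let measured = expected * 1.002;

    // Rejected under the old 1/1000 tolerance, accepted under the new 1/100 one.
    assert!(!within_divergence(measured, expected, 0.001));
    assert!(within_divergence(measured, expected, 0.01));
}
```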

@alexggh added the R0-silent label ("Changes should not be mentioned in any release notes") on Aug 27, 2024
@AndreiEres (Contributor) commented:

0.001 was set deliberately, because we are always sure that we send the same amount of data every time. The current standard deviation for values after the last change is 0.0005, with an average of 63999.3651.
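To put the numbers in scale, assuming the divergence is a tolerance relative to the baseline (and without settling whether the quoted 0.0005 is an absolute or relative figure), the absolute drift each tolerance allows around that average works out as in the following illustrative arithmetic:

```rust
fn main() {
    // Average quoted above; units are whatever the benchmark reports.
    let average = 63999.3651_f64;

    // Absolute drift allowed by each relative tolerance.
    let old_window = average * 0.001; // ~64
    let new_window = average * 0.01;  // ~640

    println!("0.001 allows about +/- {old_window:.0}, 0.01 about +/- {new_window:.0}");
}
```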

@alexggh (Contributor, Author) commented Aug 27, 2024

> 0.001 was set deliberately, because we are always sure that we send the same amount of data every time.

That's not really the case for the approval-voting benchmarks: the generated messages do not always get the same seed, so the deterministic assumption does not hold here. Some small deviation is acceptable, and 0.001 seems too strict to me.

@AndreiEres (Contributor) commented:

> 0.001 was set deliberately, because we are always sure that we send the same amount of data every time.

> That's not really the case for the approval-voting benchmarks: the generated messages do not always get the same seed, so the deterministic assumption does not hold here. Some small deviation is acceptable, and 0.001 seems too strict to me.

I'm OK with 0.01, but I think we should also change the base, because the current value is not really a base anymore. The base is more like 63999.3651.
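A rough sketch of what re-centering the baseline could look like. The constant names and the check function are hypothetical, not the actual subsystem-bench source; only the 63999.3651 average and the 0.01 tolerance come from this discussion.

```rust
// Hypothetical constants: re-centering the expected value on the observed
// average keeps the widened tolerance symmetric around where the benchmark
// actually lands.
const EXPECTED_NETWORK_USAGE: f64 = 63999.3651;
const ACCEPTED_DIVERGENCE: f64 = 0.01;

fn check(measured: f64) -> Result<(), String> {
    let deviation = (measured - EXPECTED_NETWORK_USAGE).abs() / EXPECTED_NETWORK_USAGE;
    if deviation > ACCEPTED_DIVERGENCE {
        return Err(format!(
            "network usage diverged by {:.4}, more than the accepted {}",
            deviation, ACCEPTED_DIVERGENCE
        ));
    }
    Ok(())
}

fn main() {
    // Within 1% of the baseline: passes.
    assert!(check(EXPECTED_NETWORK_USAGE * 1.005).is_ok());
    // 5% above the baseline: fails.
    assert!(check(EXPECTED_NETWORK_USAGE * 1.05).is_err());
}
```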

@alexggh enabled auto-merge August 28, 2024 08:46
@alexggh added this pull request to the merge queue Aug 28, 2024
Merged via the queue into master with commit f0fd083 Aug 28, 2024
189 of 191 checks passed
@alexggh deleted the alexggh/fix_bench_network branch August 28, 2024 09:35
ordian added a commit that referenced this pull request Aug 29, 2024
* master: (39 commits)
  short-term fix for para inherent weight overestimation (#5082)
  CI: Add backporting bot (#4795)
  Fix benchmark failures when using `insecure_zero_ed` flag (#5354)
  Command bot GHA v2 - /cmd <cmd> (#5457)
  Remove pallet::getter usage from treasury (#4962)
  Bump blake2b_simd from 1.0.1 to 1.0.2 (#5404)
  Bump rustversion from 1.0.14 to 1.0.17 (#5405)
  Bridge zombienet tests: remove old command (#5434)
  polkadot-parachain: Add omni-node variant with u64 block number (#5269)
  Refactor verbose test (#5506)
  Use umbrella crate for minimal template (#5155)
  IBP Coretime Polkadot bootnodes (#5499)
  rpc server: listen to `ipv6 socket` if available and `--experimental-rpc-endpoint` CLI option (#4792)
  Update approval-voting-regression-bench (#5504)
  change try-runtime rpc domains (#5443)
  polkadot-parachain-bin: Remove contracts parachain (#5471)
  Add feature to allow Aura collator to use full PoV size (#5393)
  Adding stkd bootnodes (#5470)
  Make `PendingConfigs` storage item public (#5467)
  frame-omni-bencher maintenance (#5466)
  ...
Labels: R0-silent (Changes should not be mentioned in any release notes)
Development

Successfully merging this pull request may close these issues.

[subsystem-bench] approval-voting bench: network increased
4 participants