
Handle accumulated affiliations efficiently #3738

Merged: 3 commits merged into master from segmented_cache_update on Aug 24, 2022
Conversation

NelsonVides (Collaborator)
As described in 33590dc

Affiliations were stored in the mongoose_acc for atomicity and performance. If a message-processing pipeline in a room starts with certain affiliations at the time of that message, we don't want some handlers in the pipeline to process the message with those affiliations, only to have the affiliations changed concurrently and the next handler, for the same message, process it with different ones.

But the code as currently written was quite redundant: the main call to get the affiliations from the accumulator ran the hook that checks the acc, the cache, and the DB. We can skip that hook if we're already in the context of muc_light by simply checking whether the affiliations are already in the accumulator, and running the hook only if they are not.

We also take the opportunity to tag the affiliations stored in the accumulator with the room's JID, to avoid the (improbable) chance of more than one room's affiliations being accumulated under the same key.


We also take the chance to upgrade the segmented_cache library and use its new functionality.
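To make the accumulator-first lookup concrete, here is a minimal sketch, not the code from this PR. It assumes mongoose_acc:get/4 and mongoose_acc:set/4 for accumulator access, and a hypothetical fetch_affs_from_cache_or_db/1 standing in for the hook that consults the cache and the database:

```erlang
%% Illustrative sketch only; names other than mongoose_acc are assumptions.
-module(aff_lookup_sketch).
-export([get_room_affiliations/2]).

get_room_affiliations(Acc, RoomJid) ->
    %% Tag the accumulator entry with the room's JID, so two rooms can
    %% never collide under the same key.
    Key = {affiliations, RoomJid},
    case mongoose_acc:get(?MODULE, Key, undefined, Acc) of
        undefined ->
            %% Nothing accumulated for this room yet: fall back to the
            %% expensive path (cache, then DB) and store the result so
            %% later handlers of the same message reuse this snapshot.
            AffUsers = fetch_affs_from_cache_or_db(RoomJid),
            Acc1 = mongoose_acc:set(?MODULE, Key, AffUsers, Acc),
            {Acc1, AffUsers};
        AffUsers ->
            %% Already accumulated: every handler in this message's
            %% pipeline sees the same affiliations, even if they change
            %% concurrently elsewhere.
            {Acc, AffUsers}
    end.

%% Placeholder for the real lookup; in MongooseIM this would run a hook
%% that checks the cache and then the database.
fetch_affs_from_cache_or_db(_RoomJid) ->
    [].
```

The point of the sketch is that the expensive hook runs at most once per message, and its result is keyed by the room JID inside the accumulator.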

codecov bot commented Aug 24, 2022
Codecov Report

Merging #3738 (884ca87) into master (26701b0) will increase coverage by 0.08%.
The diff coverage is 88.23%.

```
@@            Coverage Diff             @@
##           master    #3738      +/-   ##
==========================================
+ Coverage   82.59%   82.68%   +0.08%
==========================================
  Files         529      529
  Lines       33945    33947       +2
==========================================
+ Hits        28037    28068      +31
+ Misses       5908     5879      -29
```

| Impacted Files | Coverage Δ |
| --- | --- |
| src/mod_muc.erl | 74.41% <50.00%> (ø) |
| src/muc_light/mod_muc_light.erl | 85.23% <90.90%> (-0.59%) ⬇️ |
| src/mongoose_user_cache.erl | 97.43% <100.00%> (ø) |
| src/muc_light/mod_muc_light_room.erl | 96.34% <100.00%> (-0.05%) ⬇️ |
| ...bal_distrib/mod_global_distrib_hosts_refresher.erl | 66.66% <0.00%> (-2.23%) ⬇️ |
| src/mod_roster_riak.erl | 96.92% <0.00%> (-1.54%) ⬇️ |
| src/muc_light/mod_muc_light_db_mnesia.erl | 91.57% <0.00%> (-1.06%) ⬇️ |
| src/mod_roster.erl | 78.70% <0.00%> (-0.48%) ⬇️ |
| src/mod_muc_log.erl | 62.82% <0.00%> (ø) |

... and 9 more


mongoose-im (Collaborator) commented Aug 24, 2022

All jobs ran against commit 884ca87.

| Test job | Preset | Reports | OK | Failed | User-skipped | Auto-skipped |
| --- | --- | --- | --- | --- | --- | --- |
| small_tests_24 | small_tests | root / small | | | | |
| small_tests_25 | small_tests | root / small | | | | |
| dynamic_domains_pgsql_mnesia_24 | pgsql_mnesia | root / big | 3554 | 0 | 88 | 0 |
| ldap_mnesia_24 | ldap_mnesia | root / big | 1941 | 0 | 529 | 0 |
| dynamic_domains_pgsql_mnesia_25 | pgsql_mnesia | root / big | 3554 | 0 | 88 | 0 |
| ldap_mnesia_25 | ldap_mnesia | root / big | 1941 | 0 | 529 | 0 |
| dynamic_domains_mysql_redis_25 | mysql_redis | root / big | 3537 | 0 | 105 | 0 |
| pgsql_mnesia_24 | pgsql_mnesia | root / big | 3928 | 0 | 97 | 0 |
| dynamic_domains_mssql_mnesia_25 | odbc_mssql_mnesia | root / big | 3554 | 0 | 88 | 0 |
| internal_mnesia_25 | internal_mnesia | root / big | 2062 | 0 | 408 | 0 |
| pgsql_mnesia_25 | pgsql_mnesia | root / big | 3928 | 0 | 97 | 0 |
| elasticsearch_and_cassandra_25 | elasticsearch_and_cassandra_mnesia | root / big | 2402 | 0 | 403 | 0 |
| mysql_redis_25 | mysql_redis | root / big | 3923 | 0 | 102 | 0 |
| mssql_mnesia_25 | odbc_mssql_mnesia | root / big | 3928 | 0 | 97 | 0 |
| riak_mnesia_24 | riak_mnesia | root / big | 2241 | 0 | 396 | 0 |

chrzaszcz (Member) left a comment

Looks good 👍

chrzaszcz merged commit 42a04dc into master on Aug 24, 2022
chrzaszcz deleted the segmented_cache_update branch on August 24, 2022 14:02
chrzaszcz added this to the 6.0.0 milestone on Dec 12, 2022