
Nested grouping-over #66052

Closed
arisonl opened this issue May 11, 2020 · 4 comments
Labels
enhancement - New value added to drive a business result
estimate:needs-research - Estimated as too large and requires research to break down into workable issues
Feature:Alerting/RuleTypes - Issues related to specific Alerting Rules Types
Team:ResponseOps - Label for the ResponseOps team (formerly the Cases and Alerting teams)

Comments

arisonl (Contributor) commented May 11, 2020

As an alerting user, I want to be able to generate separate alert instances by defining a group-over on a field within another group-over (nested group-overs). As a solution-specific example, the Metrics solution would like to be able to generate an alert per disk, per host (see the 'Separate alerts are sent for each combination of host and disk' section).

arisonl added the Feature:Alerting and Team:ResponseOps labels on May 11, 2020
elasticmachine (Contributor) commented

Pinging @elastic/kibana-alerting-services (Team:Alerting Services)

pmuellr (Member) commented May 14, 2020

This is presumably for the built-in index threshold alert? I don't think there's anything alert-generic about this request, but I'm not entirely sure. And if the question is about an existing alert besides the index threshold alert, we'll want an issue for that alert as well (e.g., an observability-provided alert).

I think the implication is that we can have multiple top-N groups in the search spec that gets built, where we have a single one today. It seems like it should be a limited grouping (using the same sort of top-N limit), to keep the ES query less expensive (in the worst case).
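
A minimal sketch of what that could look like in the generated search body, assuming two grouping fields; the field names, agg keys, metric, and size limit below are illustrative assumptions, not the actual index threshold implementation:

```ts
// Hypothetical sketch: build a two-level (nested) top-N terms aggregation,
// capping each level with the same sort of top-N limit so the worst-case
// bucket count stays bounded (topN ** numberOfGroupFields).
const groupFields = ['host.name', 'system.filesystem.device_name']; // assumed example fields
const topN = 10;

// Recursively nest a terms agg per grouping field, from the outside in,
// with the metric agg at the innermost level.
const buildGroupAggs = (fields: string[]): Record<string, unknown> =>
  fields.length === 0
    ? { metricAgg: { avg: { field: 'system.filesystem.used.pct' } } } // assumed metric
    : {
        groupAgg: {
          terms: { field: fields[0], size: topN, order: { _count: 'desc' } },
          aggs: buildGroupAggs(fields.slice(1)),
        },
      };

const searchBody = {
  size: 0,
  query: { range: { '@timestamp': { gte: 'now-5m' } } },
  aggs: buildGroupAggs(groupFields),
};
```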

This will likely make #64268 a usability blocker: the instanceIds for an alert with N groups will be something like group1-group2-...-groupN, so we'll want human-readable versions of those for the UI.

We'll have to think about how to represent the group data in the context here - I guess it will need to be an array of groups, where we have a single group today.
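
As a rough illustration of the instanceId and context shapes being discussed (the delimiter, field names, and context properties are assumptions for the sketch, not an agreed design):

```ts
// Hypothetical sketch: an alert instance keyed by the combination of its
// group values, with the per-level group data carried in context as an
// array rather than the single group we have today.
interface AlertGroup {
  field: string; // e.g. 'host.name'
  value: string; // e.g. 'host-1'
}

const groups: AlertGroup[] = [
  { field: 'host.name', value: 'host-1' },
  { field: 'system.filesystem.device_name', value: '/dev/sda1' },
];

// instanceId ends up as group1-group2-...-groupN, which is why a
// human-readable label (#64268) would be needed for the UI.
const instanceId = groups.map((g) => g.value).join('-');

const context = {
  groups, // array of groups instead of a single group
  value: 0.93, // assumed metric value that crossed the threshold
};
```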

sorantis commented May 26, 2020

We're tracking this request here: #65119
A PR has already been merged.

gmmorris added the Feature:Alerting/RuleTypes label on Jul 1, 2021
gmmorris added the loe:large and loe:needs-research labels and removed the loe:large label on Jul 14, 2021
gmmorris added the enhancement and estimate:needs-research labels and removed the Feature:Alerting label on Aug 13, 2021
gmmorris removed the loe:needs-research label on Sep 2, 2021
kobelb added the needs-team label on Jan 31, 2022
botelastic (bot) removed the needs-team label on Jan 31, 2022
mikecote (Contributor) commented May 4, 2023

Closing as implemented within O11y.

mikecote closed this as completed on May 4, 2023