
Improve some logging messages for add_kubernetes_metadata processor #16866

Merged — flaper87 merged 2 commits into master from fp/kubernetes-logging on Mar 6, 2020

Conversation

@flaper87 (Contributor) commented Mar 6, 2020

  • Enhancement

What does this PR do?

Switch from Debug to Error when unrecoverable events happen, and add extra debug messages when indexing and matching pods.

I was trying to debug why my configs wouldn't work, and I had to enable debug logging to notice that the add_kubernetes_metadata processor was failing because my configs were wrong.

Sadly, my configs still weren't working even after I fixed the above, and it was painful to figure out what was going on because there weren't enough debug messages to understand which index keys, matches, and metadata were being processed by Filebeat.
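
For context, a minimal sketch of the logging pattern this change describes, using the `logp.Err` and `logp.Debug` calls visible in the diff later in this conversation. The `initMatcher` helper, its `name` and `cfg` parameters, and the config check are hypothetical, for illustration only:

```go
package matcher

import (
	"fmt"

	"github.com/elastic/beats/libbeat/logp"
)

// initMatcher is a hypothetical helper illustrating the pattern:
// unrecoverable problems (e.g. a broken config) are logged at error
// level so they are visible without enabling debug logging, while
// routine indexing/matching details stay at debug level under the
// "kubernetes" selector.
func initMatcher(name string, cfg map[string]interface{}) error {
	if len(cfg) == 0 {
		logp.Err("failed to initialize matcher %s: missing configuration", name)
		return fmt.Errorf("missing configuration for matcher %s", name)
	}
	logp.Debug("kubernetes", "initialized matcher %s with config %v", name, cfg)
	return nil
}
```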

Why is it important?

It makes operations and debugging easier.

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • I have made corresponding changes to the default configuration files
  • I have added tests that prove my fix is effective or that my feature works

@flaper87 flaper87 requested a review from a team as a code owner March 6, 2020 09:47
@flaper87 flaper87 force-pushed the fp/kubernetes-logging branch 2 times, most recently from 7d58101 to a41cb96, March 6, 2020 10:28
@andresrc andresrc added the [zube]: Inbox and Team:Platforms (Integrations - Platforms team) labels Mar 6, 2020
@elasticmachine (Collaborator) commented:

Pinging @elastic/integrations-platforms (Team:Platforms)

@blakerouse blakerouse requested review from a team and removed request for a team March 6, 2020 13:25
@blakerouse (Contributor) left a comment:

@flaper87 Change looks good. Need to add a changelog entry.

Switch from Debug to Error when unrecoverable events happen and
add extra debug messages when indexing and matching pods.
@flaper87 (Contributor, Author) commented Mar 6, 2020

> @flaper87 Change looks good. Need to add a changelog entry.

Done, thanks for the review :)

@blakerouse (Contributor) left a comment:

Thanks, looks good.

@@ -86,7 +86,7 @@ func (f *LogPathMatcher) MetadataIndex(event common.MapStr) string {
 	logp.Debug("kubernetes", "Incoming log.file.path value: %s", source)

 	if !strings.Contains(source, f.LogsPath) {
-		logp.Debug("kubernetes", "Error extracting container id - source value does not contain matcher's logs_path '%s'.", f.LogsPath)
+		logp.Err("Error extracting container id - source value does not contain matcher's logs_path '%s'.", f.LogsPath)
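
For readers without the full file open, here is a simplified reconstruction of the surrounding function, inferred from the hunk above. The `LogPathMatcher` type is trimmed to the one field the hunk uses, and the container-id extraction itself is elided; this is a sketch, not the actual implementation:

```go
package matcher

import (
	"strings"

	"github.com/elastic/beats/libbeat/common"
	"github.com/elastic/beats/libbeat/logp"
)

// LogPathMatcher is a trimmed stand-in for the real matcher type;
// only the field used in the hunk above is kept.
type LogPathMatcher struct {
	LogsPath string
}

// MetadataIndex mirrors the flow shown in the diff: log the incoming
// path at debug level, then (after this PR) report at error level
// when the path does not contain the configured logs_path.
func (f *LogPathMatcher) MetadataIndex(event common.MapStr) string {
	value, err := event.GetValue("log.file.path")
	if err != nil {
		return ""
	}
	source, _ := value.(string)
	logp.Debug("kubernetes", "Incoming log.file.path value: %s", source)

	if !strings.Contains(source, f.LogsPath) {
		logp.Err("Error extracting container id - source value does not contain matcher's logs_path '%s'.", f.LogsPath)
		return ""
	}
	// The real implementation extracts the container id from the
	// path here; elided in this sketch.
	return ""
}
```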
A contributor commented on this change:

We are in the process of refactoring log-related calls like logp.Err in #15699. I can create a separate PR for this after this PR gets merged.

@flaper87 (Contributor, Author) replied:

thank you! 😊

@flaper87 (Contributor, Author) commented Mar 6, 2020

I can't check the failures in great detail right now. Are these related to the PR, or can I send the PR through?

@blakerouse (Contributor) replied:

@flaper87 they seem unrelated

@flaper87 flaper87 merged commit 1d6323f into master Mar 6, 2020
@flaper87 flaper87 deleted the fp/kubernetes-logging branch March 9, 2020 06:22
@exekias exekias added the needs_backport (PR is waiting to be backported to other branches) label Mar 9, 2020
@exekias exekias added the v7.7.0 label and removed the needs_backport label Mar 9, 2020
exekias pushed a commit to exekias/beats that referenced this pull request Mar 9, 2020
…lastic#16866)

Switch from Debug to Error when unrecoverable events happen and
add extra debug messages when indexing and matching pods.

(cherry picked from commit 1d6323f)
@exekias (Contributor) commented Mar 9, 2020

The backport was missing; I've opened it: #16893

kaiyan-sheng pushed a commit that referenced this pull request Mar 17, 2020
…16866) (#16893)

Switch from Debug to Error when unrecoverable events happen and
add extra debug messages when indexing and matching pods.

(cherry picked from commit 1d6323f)

Co-authored-by: Flavio Percoco <[email protected]>
@rdner rdner mentioned this pull request Nov 16, 2022
rdner added a commit to rdner/beats that referenced this pull request Nov 16, 2022
This was happening due to the error-level logging when the log path
matcher detected a `log.file.path` that does not start with the standard
Docker container log folder `/var/lib/docker/containers`, because AKS
dropped support for Docker in September 2022 and switched to containerd.

It looks like this message was never supposed to be at the error level
in the first place, since it just means that the matcher didn't
match, which is not an error. It was most likely mistakenly promoted from
the debug level in elastic#16866
because the message started with `Error` and looked confusing.

This is a partial fix to unblock our customers, but we still need to come
up with full AKS/containerd support in a follow-up change.
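
To illustrate why this message fired constantly on AKS: containerd nodes expose kubelet log paths outside the Docker log folder, so the matcher's substring check never passes. A minimal sketch; the paths and pod name below are typical examples, not taken from this PR:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Typical Docker-era logs_path the matcher was configured with.
	dockerLogsPath := "/var/lib/docker/containers/"
	// A typical containerd/kubelet log path (hypothetical pod name).
	containerdLogPath := "/var/log/containers/mypod_default_app-0123.log"

	// On containerd-based AKS nodes this check fails for every event,
	// which is why the error-level message flooded the logs.
	fmt.Println(strings.Contains(containerdLogPath, dockerLogsPath)) // false
}
```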
rdner added a commit that referenced this pull request Nov 16, 2022
mergify bot pushed a commit that referenced this pull request Nov 16, 2022
(cherry picked from commit 29f0b4c)
mergify bot pushed a commit that referenced this pull request Nov 16, 2022
(cherry picked from commit 29f0b4c)
rdner added a commit that referenced this pull request Nov 17, 2022
(cherry picked from commit 29f0b4c)

Co-authored-by: Denis <[email protected]>
rdner added a commit that referenced this pull request Nov 17, 2022
(cherry picked from commit 29f0b4c)

Co-authored-by: Denis <[email protected]>
chrisberkhout pushed a commit that referenced this pull request Jun 1, 2023
Labels: Team:Platforms (Integrations - Platforms team), v7.7.0

6 participants