
Memory leak #7

Closed · mfrister opened this issue Apr 5, 2018 · 11 comments

mfrister commented Apr 5, 2018

sentry-kubernetes leaks memory on both of our clusters; at least, memory usage has only gone up over a few weeks of usage:

[Screenshot: container memory usage climbing steadily over several weeks]

We have RBAC enabled and have given the sentry-kubernetes service account the view ClusterRole.

As a workaround, we've now set the following resource requests/limits for the sentry-kubernetes container:

resources:
  requests:
    memory: 75Mi
    cpu: 5m
  limits:
    memory: 100Mi
    cpu: 30m

We expect these limits to restart the container every few days.

@bretthoerner (Contributor)

Interesting. I don't maintain any global state of my own, so it must be something inside the kubernetes or raven-python clients, hmmm...
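
Not confirmed as the cause, but one way to narrow this down is a snapshot diff with Python's stdlib tracemalloc, run inside the agent process. A minimal sketch (the list comprehension is just a stand-in for letting the watch loop run between the two snapshots):

import tracemalloc

tracemalloc.start(25)  # record up to 25 stack frames per allocation
baseline = tracemalloc.take_snapshot()

# Stand-in for the agent's real work: in the live process you would let
# the event-watching loop run for a while between the two snapshots.
leaky = [object() for _ in range(100_000)]

current = tracemalloc.take_snapshot()
for stat in current.compare_to(baseline, "lineno")[:10]:
    print(stat)  # biggest sources of new allocations, by source line

If the top entries point into site-packages/kubernetes or site-packages/raven, that would show where the leak lives.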

@dneuhaeuser-zalando

I observe the same problem, with the pod getting OOM-killed quite frequently.

[Screenshot: memory usage (green) rising to the limit (light blue), then dropping sharply]

Green is usage, light blue is the limit. The cases on the graph where the cliff occurs before the limit is reached are probably due to the pod being rescheduled during a cluster upgrade.

@stephenlacy

I am getting this as well. In our case it used 6 GB of RAM before getting force-evicted by Kubernetes.

@stephenlacy

I ended up creating a lightweight reporter in Go that uses 7-10 MB of RAM total and reports pod failures:
https://github.com/stevelacy/go-sentry-kubernetes

wichert commented Jun 4, 2019

I wonder if this is due to the use of breadcrumbs?
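
One cheap way to test that hypothesis: construct the raven client with breadcrumb collection turned off and watch whether memory still grows. This sketch assumes raven-python's enable_breadcrumbs client option (worth double-checking against the installed version); the DSN is a placeholder.

from raven import Client

client = Client(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    enable_breadcrumbs=False,  # assumed option: skip breadcrumb collection
)
client.captureMessage("running with breadcrumbs disabled")

If usage flattens out with breadcrumbs off, the per-thread breadcrumb buffers (or the threads holding them) would be the prime suspect.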

wichert commented Oct 22, 2019

FWIW I made another alternative: https://github.com/wichert/k8s-sentry. That means there are now three options:

  • this version. Currently not really usable since it has this memory leak and isn't actively maintained.
  • @stevelacy's go-sentry-kubernetes. A small Go reporter without a memory leak. This monitors pods for changes and reports those to Sentry. It includes very little information in error messages.
  • my k8s-sentry. Another small Go reporter. This uses the same approach as this project: it monitors events and submits warning and error events to Sentry. It includes a fair bit of information from each event (object kind, namespace, component, event reason, event message, action taken, etc.). I might extend it to fetch the involved objects as well, so it can add their labels and extra information too (which would solve Affected Object Labels as Tags #13). A sketch of this shared event-watching approach follows below.
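
For reference, that shared approach (stream cluster events, forward the Warning-level ones together with their metadata) looks roughly like this in Python with the official kubernetes client and raven. This is an illustrative sketch, not the actual code of any of the three projects, and the DSN is a placeholder:

from kubernetes import client, config, watch
from raven import Client as SentryClient

config.load_incluster_config()  # use load_kube_config() outside a cluster
v1 = client.CoreV1Api()
sentry = SentryClient(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")

for item in watch.Watch().stream(v1.list_event_for_all_namespaces):
    obj = item["object"]  # a V1Event
    if obj.type != "Warning":  # skip the routine "Normal" events
        continue
    sentry.captureMessage(
        f"{obj.reason}: {obj.message}",
        level="warning",
        tags={
            "kind": obj.involved_object.kind,
            "namespace": obj.involved_object.namespace,
            "component": obj.source.component if obj.source else None,
            "reason": obj.reason,
        },
        extra={"name": obj.involved_object.name, "action": obj.action},
    )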

stephenlacy commented Oct 22, 2019

@wichert I like the way you parse the events with the AddFunc rather than the UpdateFunc. I'll probably change mine in a similar way; coercing the description and reason didn't work as well as I hoped.

Edit: my Go client now also provides detailed error information.
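
For anyone following along in Python rather than Go: client-go's AddFunc has a rough analogue in the watch stream, since each item carries a type, and reacting only to ADDED items means handling each new Event object exactly once instead of on every count or lastTimestamp update. A sketch, assuming the same v1 client setup as the sketch above and a hypothetical handle_new_event handler:

for item in watch.Watch().stream(v1.list_event_for_all_namespaces):
    if item["type"] != "ADDED":
        continue  # skip MODIFIED/DELETED; each new Event is one occurrence
    handle_new_event(item["object"])  # hypothetical handler, not a real API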

zifeo commented Oct 22, 2019

@wichert Great to see some alternatives. Do you also provide a manifest or a Helm chart?

wichert commented Oct 22, 2019

@zifeo I will, once I have a first release and an image on Docker Hub.

wichert commented May 12, 2020

@zifeo I forgot to mention it, but I have example manifests now.

tonyo (Contributor) commented Nov 28, 2023

Hi,
The agent has a completely different implementation now (rewritten in Go), so I'll close this one as outdated.
Thanks everyone for the discussion and listing alternatives 👍

tonyo closed this as completed Nov 28, 2023