
Karapace 3.4.x consuming all CPU #488

Closed
juha-aiven opened this issue Nov 11, 2022 · 0 comments · Fixed by #490
Labels
bug: Something isn't working · Regression: This worked before, but not anymore

Comments

@juha-aiven
Contributor

What happened?

In Karapace 3.4.x it is possible to send HTTP requests to the Kafka REST proxy that cause the Karapace process to consume almost all of a single CPU core. Additionally, Karapace causes high load (~30-50%) on Kafka. The situation remains even if the process is left alone (no traffic sent to it) for hours.

The issue was found while running manual tests using consumers that are not cleaned up correctly. The exact steps are hard to reproduce here, since the tester and the exact test steps are part of a larger project that cannot be shared.
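For context, consumers on the REST proxy are created and destroyed over HTTP, and an instance that is never deleted keeps its server-side fetch loop alive. A sketch of that lifecycle, assuming the Confluent-compatible v2 endpoints and an illustrative `localhost:8082` address, group, and instance name:

```shell
# Create a consumer instance (group and instance names are illustrative)
curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" \
  --data '{"name": "my_consumer", "format": "json", "auto.offset.reset": "earliest"}' \
  http://localhost:8082/consumers/my_group

# A consumer that is no longer needed must be deleted explicitly;
# skipping this step leaves the instance and its fetch loop alive
curl -X DELETE http://localhost:8082/consumers/my_group/instances/my_consumer
```

Tests that create consumers but skip the DELETE step leave such instances behind, which matches the "not cleaned up correctly" scenario described above.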

What else do we need to know?

Karapace 3.4.x has the issue. It's unknown whether older versions have it.

@juha-aiven juha-aiven added the bug Something isn't working label Nov 11, 2022
@jlprat jlprat added the Regression This worked before, but not anymore label Nov 11, 2022
tvainika added a commit that referenced this issue Nov 11, 2022
Negative values for fetch min bytes were a misconfiguration even with
kafka-python earlier, but the behavior worsened with aiokafka. Add extra
configuration for tuning the aiokafka timeout settings.

Fixes #488
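The commit message above points at negative fetch-min-bytes values as the root cause. A minimal sketch (a simplified model, not Karapace or aiokafka code) of why such a misconfiguration busy-loops: a Kafka fetch normally blocks on the broker until at least `fetch.min.bytes` of data is available or `fetch.max.wait.ms` elapses, but a non-positive minimum returns immediately even when the topic is empty, so the consumer loop spins at full speed.

```python
import time

def poll(records_available: int, fetch_min_bytes: int,
         fetch_max_wait_ms: int) -> bool:
    """Simplified model of one fetch round-trip. The broker holds the
    request until fetch_min_bytes are available or the wait expires;
    with fetch_min_bytes <= 0 it returns at once, data or not."""
    if records_available >= max(fetch_min_bytes, 1):
        return True  # data returned
    if fetch_min_bytes <= 0:
        return False  # empty response, no waiting -> caller loops again
    time.sleep(fetch_max_wait_ms / 1000)  # broker-side wait
    return False

def count_fetches(fetch_min_bytes: int, duration_s: float = 0.05) -> int:
    """Count fetch round-trips a consumer loop issues in duration_s
    against an empty topic."""
    deadline = time.monotonic() + duration_s
    fetches = 0
    while time.monotonic() < deadline:
        poll(records_available=0, fetch_min_bytes=fetch_min_bytes,
             fetch_max_wait_ms=10)
        fetches += 1
    return fetches

busy = count_fetches(fetch_min_bytes=-1)  # returns instantly every time
calm = count_fetches(fetch_min_bytes=1)   # broker waits up to 10 ms
print(busy, calm)
```

With the negative minimum, the loop issues orders of magnitude more fetches in the same wall-clock window, which also explains the elevated load observed on the Kafka side: every spin is another fetch request hitting the broker.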