CPU load grows in time #6129

Closed
maskac opened this issue Sep 26, 2020 · 19 comments

Comments

@maskac

maskac commented Sep 26, 2020

NS 14.0.4, Docker hosting for 150 users.
After upgrading from 13.0.1 to 14.0.3 and then 14.0.4, CPU load grows over time.
[screenshot]
With 13.0.1 no problem occurred.

@sulkaharo
Member

sulkaharo commented Sep 27, 2020

Found a memory leak, fixing. Thank you for the report - the increase on my single instance monitor is so small I didn't notice this...
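For anyone chasing a similar leak, a minimal sketch of one common diagnostic, assuming a stock Node runtime; the SIGUSR2 trigger and the filename pattern are illustrative choices, not Nightscout's own debug hooks:

```js
// Sketch: write a V8 heap snapshot on demand so that two snapshots taken
// hours apart can be diffed in the Chrome DevTools Memory tab; objects
// whose retained size keeps growing between snapshots are leak candidates.
const v8 = require('v8');

process.on('SIGUSR2', () => {
  const file = v8.writeHeapSnapshot(`heap-${Date.now()}.heapsnapshot`);
  console.log(`heap snapshot written to ${file}`);
});
```

Trigger it with `kill -USR2 <pid>`; snapshots are expensive to write, so this suits debugging sessions rather than steady-state production.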

@sulkaharo
Member

Based on monitoring the data loads reported in #6133, it looks like this fixes the leak. Can you help validate the fix ASAP, so I can push this out as a point release?

@maskac
Author

maskac commented Sep 27, 2020

Now upgrading all Nightscout instances; I will report back in a few hours.

@sulkaharo
Member

Awesome, thank you! ❤️

@sulkaharo
Member

Any news yet? Looking at my instance, I don't see the memory use going up anymore.

@maskac
Author

maskac commented Sep 27, 2020

I don't know yet. It's too early to tell.
[screenshot]

@maskac
Author

maskac commented Sep 27, 2020

[screenshot]

@sulkaharo
Member

What was the memory use before you updated, or is that shown on the graph?

@maskac
Author

maskac commented Sep 27, 2020

Memory was a small problem; CPU load was the big problem. At 21:00 I restarted all Nightscout instances. At 13:00 the upgrade was made.
[screenshot]

@maskac
Author

maskac commented Sep 27, 2020

The first screenshot was wrong; this one is correct.
[screenshot]

@maskac
Author

maskac commented Sep 27, 2020

The small peaks are the backup process. The growth in CPU load is from Nightscout.
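As a companion to host-level graphs like these, CPU creep can also be sampled from inside the Node process itself; a minimal sketch, assuming nothing beyond the built-in `process.cpuUsage()` (the one-minute interval is an arbitrary choice):

```js
// Sketch: log how much CPU time this process consumed in each interval.
// process.cpuUsage() returns cumulative user/system time in microseconds,
// so the delta between samples shows whether per-minute cost is creeping up.
let last = process.cpuUsage();

setInterval(() => {
  const now = process.cpuUsage();
  const userMs = ((now.user - last.user) / 1000).toFixed(0);
  const sysMs = ((now.system - last.system) / 1000).toFixed(0);
  console.log(`cpu last 60s: user=${userMs}ms system=${sysMs}ms`);
  last = now;
}, 60 * 1000);
```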

@sulkaharo
Member

Right, OK. I didn't notice at first that the creep is that slow. Would it be possible for you to add memory monitoring that tracks the Node processes? CPU use in NS is 100% dependent on the data in the runtime, so if CPU is creeping but memory use isn't, that implies the reported memory belongs to the VM instances, which allocate more than Node needs, and the true Node memory use isn't actually being tracked.
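A minimal sketch of the kind of per-process monitoring being asked for here, using the built-in `process.memoryUsage()`; the interval and plain console output are placeholder choices, and where the numbers get shipped is up to the deployment:

```js
// Sketch: periodically log the Node process's actual memory breakdown.
// heapUsed is the figure that matters for a JS-level leak; rss also counts
// memory the VM has reserved from the OS but not yet handed to JS objects,
// which is why container-level graphs can mask what the heap is doing.
setInterval(() => {
  const m = process.memoryUsage();
  const mb = (n) => (n / 1024 / 1024).toFixed(1);
  console.log(
    `rss=${mb(m.rss)}MB heapTotal=${mb(m.heapTotal)}MB ` +
    `heapUsed=${mb(m.heapUsed)}MB external=${mb(m.external)}MB`
  );
}, 60 * 1000);
```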

@maskac
Author

maskac commented Sep 27, 2020

I don't have a monitoring agent inside the containers; memory use is about 80-100 MB per container.
[screenshot]

@sulkaharo
Member

Right. Given I got a couple of confirmations that this seems to at least fix the issue visible in the logs, I've merged this to dev and released it as 14.0.5; this should also fix the data backfills. Let's keep this open, and please report back here when you have longer-term results.

@sulkaharo
Member

How does the data look now? Also, ping: I released 14.0.6 to fix an issue with batch upload of data from Loop, so you might want to update again. This resolves the outstanding big issues, so hopefully there won't be more updates in the next few weeks.

@maskac
Author

maskac commented Sep 28, 2020

I looked a few minutes ago and I don't know yet.

The upgrade to 14.0.5 was made around 20:00. The peak is the backup process. From 8:00 there may be some growth in load...
[screenshot]

@sulkaharo
Member

If you compare that graph to the one at the top, where there was a clear 10% increase in the average over 24 hours, it looks to me like at least the bad resource leak is gone.

@maskac
Author

maskac commented Oct 1, 2020

CPU load & memory use look good with 14.0.5.
[screenshot]

Good job, Thanks.

@sulkaharo
Member

Great! Btw, .5 has a small bug that breaks device status uploads for Loop users, fixed in .6, so you might want to update once more. Apologies for that.
