High CPU/memory consumption after a short while, zombie process remains after quitting with "Quit" - caused by author autocomplete preference #9952
Is indexing still running when CPU/RAM maxes out? The first run after upgrading JabRef to a new version usually reindexes your whole library; it is recommended to wait until that is finished before starting to work. Also, is JabRef's RAM usage below your device's maximum available RAM? How many entries are we talking about here? Have you tried following https://docs.jabref.org/faq#q-i-have-a-huge-library.-what-can-i-do-to-mitigate-performance-issues? If you cannot point to a trigger for what exactly causes the undesired behaviour, a fix is unlikely. If you are knowledgeable about coding (or have a friend with an IT background) and are willing and have the time to go deeper, you could set up a local workspace and do some debugging.
Finding the regression window would also help. Try to find the exact commit that broke JabRef.
Hi @ThiloteE, thanks a lot for the questions and the FAQ hint. Let me quickly answer your questions as far as possible:
OK, I won't be able to engage in coding, but I can try to find the regression window. Also, is there a way to delete the old index and force JR to build a new one?
You can rebuild the index under Tools -> Rebuild fulltext search index. I would also recommend trying to reset the preferences and checking whether that has any effect.
+1 JabRef 5.10--2023-06-12--92dcad3
JabRef continues to suck cycles in the background even after program exit. It has to be killed.
Similar problem here. Consumes lots of CPU and remains even after JabRef is closed.
Perhaps I should add that after installation of jabref_5.9_amd64.deb with QApt in Kubuntu 22.10, it worked fine four or five times. But after that, it consumes CPU every time a word search is done. I have disabled index searching and other features, to no avail.
still a problem |
Do you have anything in the log files? JabRef writes log files to the file system: https://docs.jabref.org/faq/linux#where-can-i-find-jabrefs-log-files
Thank you for your reply. I just opened JabRef: high CPU consumption. I closed the app, and a JabRef 'orphan' remained (see image). Attached is the log.txt file; the other file, log.txt.lock, is empty.
Hey, the other day I repaired my ancient computer from 2012 and learned a lot about hardware, which allowed me to squeeze every little bit of performance out of this old machine. May I ask if you could provide your hardware details, so that we can better estimate the scope of the problem?
Source: https://www.binarytides.com/linux-commands-hardware-info/ Edit: obviously, the real problem is the zombie processes, not the hardware! T.T
@wujastyk your hardware is pretty decent. Clearly not from 2012 and definitely powerful enough for (non-buggy) JabRef and normal office work. I guess we somehow need to debug this. I might have a look tomorrow or one of these days and will try to reproduce on my Linux machines. Maybe I will find something.
Here are also my files. Thanks for your help.
+1 This is my large bib file (6 MB): AAA-papers.bib
I tried to heavily open and close the JabRef development version 5.10 from 2023-07-25 (not doing anything else) and I do not get zombie processes or high CPU on my old machine from 2012 running Fedora KDE 38, but it was only a small database of maybe 30 entries. I could imagine a database with thousands of entries behaving differently; alternatively, it must be triggered by something you do while working within JabRef.
I wonder if my database can be of use for testing. I attach it.
My large bib is here, if it helps.
Hi everyone, Unfortunately, I can't reproduce the behavior on my machine, nor can any of the other maintainers. Our best bet is that someone having the problem could provide us with performance profiling information so we have something to debug. We need a thread dump of one of the zombie processes, and also CPU/memory samples for when the CPU usage spikes. You can start the sampling process after opening JabRef and stop it after reproducing the high CPU usage behavior. Everything mentioned above can be done with VisualVM. Feel free to include any additional information that VisualVM can generate.
Thank you for taking the time to look into this problem.
Hi @mfguasti, Thank you for the quick response. To capture a thread dump, you need to launch JabRef and try to reproduce the zombie-process behavior. Then go to VisualVM, right-click the JabRef node (the zombie process) in the Applications window, and choose Thread Dump. To capture CPU samples, you need to:
1. Select the JabRef node in VisualVM's Applications window.
2. Open the Sampler tab and click "CPU" to start sampling.
3. Reproduce the high CPU usage, then stop the sampler and save the snapshot.
Capturing memory samples is very similar to CPU: follow the same steps as above, just click "Memory" instead of "CPU" in step 2 to start sampling in VisualVM.
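If VisualVM is unfamiliar, the JDK's `jstack <pid>` command prints the same thread-dump information. A thread dump can also be produced from inside a JVM with the standard `java.lang.management` API; here is a minimal sketch (the class name is illustrative, and note that `ThreadInfo.toString()` caps the printed stack depth):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Prints the name, state, and stack frames of every live thread in this
// JVM: the same information VisualVM's "Thread Dump" action collects.
public class ThreadDumpSketch {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Arguments: include locked monitors and locked synchronizers.
        for (ThreadInfo info : threads.dumpAllThreads(true, true)) {
            System.out.print(info);
        }
    }
}
```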
Hi @HoussemNasri, With great difficulty I managed to get the above files with the JabRef zombie running. I am afraid that I am completely unfamiliar with Java and with VisualVM. It seems to me that you require a more competent collaborator. I could not sample the CPU; VisualVM gave me a message that something was missing. I hope this limited information helps.
Hi @mfguasti, Thank you for your cooperation. As an initial analysis, it seems the high CPU consumption is caused by the author-name suggestions in the autocomplete feature. For some reason, it allocates multiple threads and continues running even after JabRef is closed. To reproduce:
1. Enable the autocomplete feature in the preferences.
2. Open a library and type into the Author field of an entry, so that name suggestions appear.
3. Close JabRef and observe that the process keeps running with high CPU usage.
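For context on why a process can survive "Quit": the JVM only exits once all non-daemon threads have finished, so a worker pool that is never shut down keeps the process alive (and, if its tasks spin, busy) after the window closes. Below is a minimal sketch of that failure mode, assuming a hypothetical suggestion task on a non-daemon pool; this is not JabRef's actual code:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Illustrative only: newFixedThreadPool creates non-daemon worker
// threads, and the JVM cannot exit while any non-daemon thread is alive.
public class ZombieProcessSketch {
    public static void main(String[] args) {
        ExecutorService suggestionPool = Executors.newFixedThreadPool(4);
        suggestionPool.submit(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                // hypothetical work: recompute author-name suggestions
            }
        });
        // main() returns here, but the process lives on at high CPU,
        // matching the "zombie after Quit" symptom in this issue.
        // A fix is to call suggestionPool.shutdownNow() on exit, or to
        // build the pool from a ThreadFactory that marks threads daemon.
    }
}
```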
Hello everyone, please try this build and see if the problem persists.
I have just downloaded the Debian file from your 'build' link. I will try it now.
JabRef 5.10-PullRequest10159.1233--2023-08-13--7de82a7 Turned the autocomplete feature on again. So far so good!
Yes, I could reproduce the problem, after several tries, with autocompletion on, using JabRef 5.10--2023-08-05--faa6288. Then I installed the new build, JabRef 5.10-PullRequest10159.1233--2023-08-13--7de82a7, and the behaviour is good so far: JabRef does not remain in memory with high CPU after closing down. There are still some quirks with autocomplete, though. Maybe related, maybe they need their own Git issue.
This one I'm not sure I understand; feel free to open a new issue for it.
Yeah, I noticed that too; I created an issue here: #10161
First of all, thanks to everyone responding to this intricate issue and trying to figure out possible causes. Yesterday I finally had time again to update JR, in this case to JabRef 5.11--2023-09-06--ddbc736. I've had autocompletion switched off (for years now); I'm not using this feature. But I've had indexing switched on, using four files, one including about 4500 entries, and I use regexp search quite regularly. Unlike hinted at, JR 5.11 does not seem to store logs in .cache/jabref/logs and I couldn't do anything about it, though the log4j output is flushed into the console, where I can see stuff like
which might have to do with the logger settings. Anyway, indeed, the high CPU consumption showed up again after about a quarter of an hour:
I then followed the suggestions to switch off indexing (autocomplete was already switched off), to completely reset my configuration (prefs.xml), and to make sure that the mentioned features are switched off. The result: the problem seems to be gone for me.
I know this information is not satisfactory for bugfixing, but I hope it helps to simplify the hunt.
Thanks, @koppor, for pointing me to #8963, which I've subscribed to now. Just for the record: the problem has shown up again with 5.11 too, after having JR running for around half an hour (despite having completely deleted my JR settings, and with auto-indexing and -completion switched off). At least after closing JR, top shows that the process is fully gone. But never mind, I'm looking forward to the new search implementation.
@ytzemih What version of JabRef is this? Are you compiling JabRef from source by any chance? When I use the newest development version, I do not see these error messages in the command-line log. Be aware that JabRef currently uses a customized version of JavaFX, so there are particular steps to follow if you compile from source on Linux.
@ThiloteE No, I'm using the binary from https://builds.jabref.org/main/jabref_5.11_amd64.deb (currently still from Sep 6 or 7). So, no compilation. Also, I am acting under the assumption that the JR deb just uses the Ubuntu JavaFX package. Do you think that two JavaFX versions are interfering here? Thanks, I've been following #8977 a little bit. It works for me as well. My pragmatic solution to the typing performance problem is to clear the search field after my search and return to the entry editor; typing is swift again. It's not a big deal, once you know the cause.
JabRef version
Latest development branch build (please note build date below)
Operating system
GNU / Linux
Details on version and operating system
JabRef 5.10--2023-05-25--e021686 Linux 5.19.0-42-generic amd64
Checked with the latest development build
Steps to reproduce the behaviour
Since my update from JR 5.9 to 5.10, my JR instance (usually with 3 to 4 bigger libraries open) starts to consume 2 of the 4 CPU cores (at 100%) and a lot of memory after a short while (10-15 minutes). I became aware of this when the CPU fan started to run high. Moreover, a zombie process remains after exiting with "Quit" from the menu, keeping the CPU busy as described. The only remedy is to kill the process (e.g., with top).
I don't know whether or how this issue might be related to some of the issues listed there.
Note: I have now switched back to JR 5.9. Opening the same libraries (with autoindexing switched on) and making edits, the problem seems not to occur. So it does not seem to be my specific configuration.