Need a guidance to tune the extension usable for large projects with limited resources #7119
What is your CPU/memory usage like when the C_Cpp.intelliSenseEngine is set to "Tag Parser"? If you experience problems with that, then that means there is a problem searching for files or parsing files (the database icon in the status bar should indicate if it's searching for files or parsing). Adding paths to files.exclude can help with tag parsing. The workspaceParsingPriority setting just affects tag parsing of the workspace folder. Otherwise, it's a problem with IntelliSense or cpptools-srv -- those processes are based on the currently opened TU and have nothing to do with "large projects", just the size of the current TU (source files plus included headers). It's possible you could be hitting a bug with our parser that is causing too much CPU or memory to be used. We may need some sort of repro or sample code in order to investigate further.
@sean-mcmanus Here is my `c_cpp_properties.json`:

```json
{
    "configurations": [
        {
            "name": "Linux",
            "compileCommands": "${workspaceFolder}/compile_commands.json",
            "includePath": [
                "${workspaceFolder}/**"
            ],
            "defines": [],
            "compilerPath": "/path/to/toolchain/bin/g++ --sysroot=/path/to/toolchain/toolchain-root",
            "cStandard": "c11",
            "cppStandard": "c++14",
            "intelliSenseMode": "gcc-x64",
            "browse": {
                "path": [
                    "${workspaceFolder}",
                    "/path/to/toolchain/toolchain-root/usr/include/"
                ],
                "limitSymbolsToIncludedHeaders": true
            }
        }
    ],
    "version": 4
}
```

and the relevant part of my `settings.json`:

```json
{
    "C_Cpp.loggingLevel": "Debug",
    "C_Cpp.updateChannel": "Insiders",
    "C_Cpp.enhancedColorization": "Disabled",
    "C_Cpp.workspaceParsingPriority": "low",
    "C_Cpp.intelliSenseMemoryLimit": 8192,
    "C_Cpp.intelliSenseEngine": "Tag Parser"
}
```

Most of the time there is no lagging; VS Code froze only twice. So I am guessing that is mostly due to IntelliSense or cpptools-srv.
If intelliSenseEngine is "Tag Parser" and there are no cpptools-srv processes, then the issue can't be due to IntelliSense/cpptools-srv. When you set C_Cpp.loggingLevel to "Debug", what does the output say is happening? Do you get a lot of tag parsing messages? Does the logging stop, as if it's stuck?
@sean-mcmanus I hadn't looked at the output logs.
So you're saying your original issue repros with intelliSenseEngine set to "Default"? You mentioned some freezing with intelliSenseEngine set to "Tag Parser". The output window should auto-scroll if the cursor is at the bottom. We have some docs on how to get more info on performance issues at https://github.com/microsoft/vscode-cpptools/wiki/Troubleshooting-Performance-Issues . If you're seeing cpptools-srv memory usage get too large and having it restart due to the limit, you may be hitting an issue similar to #7085 .
Yes, my original issue should reproduce with intelliSenseEngine set to "Default".
Yeah, then it sounds like our IntelliSense parser in cpptools-srv is hitting some performance issues. We have one known repro at #7085, which could potentially share the same root cause. Otherwise, we might need some sort of sample repro to investigate further. If you were to generate a preprocessed file via something like …
Just before I was going to switch to the "Default" mode, I got a freeze in "Tag Parser" mode. I guess that's because after I ran git fetch and git rebase to update my codebase, the symlink of
You can add /home/xx/workspace/yy/build to your files.exclude setting to avoid that processing. |
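For reference, a minimal sketch of that settings.json entry; the glob form is an assumption (files.exclude keys are glob patterns rather than absolute paths):

```jsonc
{
    // true hides and excludes matches; "**/build" matches the symlinked
    // build directory anywhere in the workspace.
    "files.exclude": {
        "**/build": true
    }
}
```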
No, I can't exclude that, since some symbols are in the generated … But I want to exclude the …
Adding `builds` to files.exclude should work, but you'd need to make sure the build directory is still somehow added to the includePath or browse.path so symbols are parsed. Do you hit any issues with that?
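A sketch of what that could look like in c_cpp_properties.json, building on the config posted earlier (the `builds` path is a placeholder from this thread):

```jsonc
{
    "configurations": [
        {
            "name": "Linux",
            "browse": {
                // builds/ can sit in files.exclude for the explorer/watcher,
                // while still being listed here so its symbols are parsed.
                "path": [
                    "${workspaceFolder}",
                    "${workspaceFolder}/builds"
                ]
            }
        }
    ],
    "version": 4
}
```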
Does the following config correctly exclude all the files and directories under the
And this config is put into settings.json instead of c_cpp_properties.json, right? |
It goes in settings.json (or the settings section of your workspace file). |
OK. Updated the config to |
In "Tag Parser" mode with the above config change, VS Code is very smooth.
There is a crash from
In the log, I see 6 logs like |
Do you see a "crashed" message in the logging? I assume not, which would imply you're hitting a crash on shutdown such as #7161 . If the crash is happening when the process is shutting down, there shouldn't be any negative effect, other than crash dumps being created if your system is configured to do that.
There is no "crashed" message in the logging. But now I see that all 11 GB of memory plus 2 GB of swap have been exhausted, and there are 3 GUIDs of
That doesn't sound normal. There is one known case of this at #7085, for which we have a fix in our next release. Attaching a debugger to cpptools-srv to get a call stack (https://github.com/microsoft/vscode-cpptools/wiki/Attaching-debugger-to-cpptools-or-cpptools%E2%80%90srv) or getting perf logs (https://github.com/microsoft/vscode-cpptools/wiki/Troubleshooting-Performance-Issues) could determine what the cause is and whether it's the same or not.
Here is the call stack of the core dump. It looks like the same issue as #7161.
|
Yes, thanks for confirming. Also, we don't currently believe that core dump is related to the other memory usage issue you're seeing, although it's possible that under high memory usage conditions you could be seeing the IntelliSense process shut down (intentionally) and restart due to hitting the C_Cpp.intelliSenseMemoryLimit.
The performance issue is fairly easy to reproduce.
Only 1 cpptools-srv process is supposed to exist per TU (open source file), up to a per-core max. Running Find All References will also temporarily create them. If you have 28 existing, you may want to kill them or restart your machine if you want to identify which process to get perf data from -- however, my guess is that you're actually seeing threads and not processes (so 14-28 threads would be normal). Also, can you try your repro with opening just one of the files? The files should be independent and spawn separate cpptools-srv processes, so opening 2 files is unlikely to be necessary. FYI, the size of the particular files can be deceptively small, because a TU may include project and system headers which can increase the size to hundreds of thousands of lines of code. You can run a command like …
This issue has been closed automatically because it needs more information and has not had recent activity. |
Type: LanguageService
Describe the bug
I have Ubuntu 18.04 running inside VMware Fusion.
The host is a MacBook Pro with a 2.3 GHz quad-core Intel Core i5 and 16 GB of 2133 MHz LPDDR3 memory.
The virtual machine is assigned 6 cores and 12 GB of memory and is used just for coding, not for compiling.
Project
The project is large, with third-party libraries in a custom toolchain, but it targets regular x86-64 rather than embedded systems.
The legacy build system is CMake, and it is now based on Bazel.
Here are the statistics from the `cloc` command. These statistics don't include the library headers in the toolchain.
The project includes protocol buffer files. In order to give IntelliSense all the symbols, I usually compile all the `.proto` files using CMake. This generates a `compile_commands.json` which includes all the `.cc` files and the `.pb.cc` files (compiled from `.proto`). This `compile_commands.json` is fed into the C++ extension for IntelliSense.

In the root directory of the project, there is a `builds` directory. If you compile the current branch, e.g. `master`, it generates a directory named after the config inside `builds`, such as `build-master-debug`. So if you check out another branch and then compile it, another directory is generated inside `builds`. But another directory, `build`, is always soft-linked to the `build-xxx-yyy` directory inside `builds` for the current branch and config. Actually, `compile_commands.json` is itself a soft link to the file under `build`.
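The layout described above can be sketched as follows (directory names follow the thread; the `/tmp/proj` root is a placeholder):

```shell
# builds/ holds one tree per branch+config; top-level symlinks always
# point at the tree for the current branch and config.
mkdir -p /tmp/proj/builds/build-master-debug
cd /tmp/proj
touch builds/build-master-debug/compile_commands.json
ln -sfn builds/build-master-debug build            # build -> current tree
ln -sfn build/compile_commands.json compile_commands.json
readlink build                                     # prints: builds/build-master-debug
```

Note that after a branch switch, `build` is re-pointed at a different tree, so everything reachable through the symlink appears to change at once.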
Also note that the generated `.pb.cc` and `.pb.h` files are placed in the `build-xxx-yyy` directory.

Config
Here is my `c_cpp_properties.json`, and here is the part of `settings.json` related to this extension.

On this machine I currently use only one branch, `master`, so there is just one directory inside `builds`. I found that if I check out multiple branches, the extension (or language server) becomes busy very frequently whenever a new branch is checked out.
Steps to reproduce

Once I open VS Code for this project, the extension (or language server) frequently takes very high CPU and memory, regardless of which values I choose for `C_Cpp.workspaceParsingPriority` and `C_Cpp.intelliSenseMemoryLimit`. Usually, when this happens, the whole virtual machine freezes and nothing can be done.
Back then, I set `C_Cpp.workspaceParsingPriority` to `high`, and I found the system would eventually freeze, though sometimes it idled along just fine. Now I want to find a balanced config that lets me code in this project and enjoy the convenience of IntelliSense, with an acceptable amount of resource usage and a system that stays responsive all the time. So for now I keep it at `low`.

For memory, even with the limit set to `8192`, the whole 12 GB plus 2 GB of swap is sometimes exhausted. I also tried setting the limit to `4096`, and then the GUID of `cpptools-srv` changed frequently; sometimes multiple GUIDs show up in the `htop` output. I don't know whether that is by design. My guess is that when the memory limit is reached, the old instance is killed and a new one is brought up; but because of the size of the project, the limit is soon reached again and the pattern repeats.

Expected behavior
Firstly, I want suggestions to make the extension usable in my case. Are there any settings that could make the language server work less aggressively, e.g. throttling it to actively process only some number of files? It would be better for the language server to run under such limits instead of being killed whenever its resource usage exceeds the limit.

The extension community should also provide guidance or steps to help users tune the config to make the extension usable for a large project, since I see a lot of similar issues here.

To support that, the community should also publish performance numbers showing roughly how many resources a project of a given scope needs for the extension to work smoothly, so users don't waste time tuning when their system resources are below the lower bound.

It would also help if the extension exposed live statistics, e.g. how many files are being analyzed and how many in total, so that when config items are changed the user can observe whether the extension is operating within the expected limits.
Screenshots
Here is a screenshot of `htop`.