
High cpu/gpu usage when window is minimized for example_win32_directx11 #2496

Closed
evga opened this issue Apr 15, 2019 · 7 comments

evga commented Apr 15, 2019

Version/Branch of Dear ImGui:

Version: 1.69
Branch: master

Back-end/Renderer/Compiler/OS

Back-ends: imgui_impl_directx11.cpp + imgui_impl_win32.cpp
Compiler: Visual Studio Community 2019
Operating System: Windows 10

My Issue/Question:

Open the DirectX11 example application and minimize the window.

CPU goes from 8% to 50%
GPU goes from 30% to 70%

This doesn't happen with the OpenGL/Vulkan/DX9 examples.

ocornut added a commit that referenced this issue Apr 15, 2019

ocornut commented Apr 15, 2019

It looks like when the surface is minimized, Present() doesn't wait for vsync, so the loop runs at the maximum possible framerate on that core (*).

It doesn't seem to happen with the DX12 example, possibly because IDXGISwapChain3 (used in DX12) behaves differently from the IDXGISwapChain we use? This needs to be investigated.

In our case in the DX11 example, Present() returns the HRESULT DXGI_STATUS_OCCLUDED (0x087A0001: "The Present operation was invisible to the user.") without waiting. I do not know if there is a setting to request that the driver wait, or if we need to do the waiting by our own means.

(*) There was actually a missing early out in the DX10/11/12 render code, which I just pushed a fix for. With it added, my main loop went from 3500 FPS to 7500 FPS here. That's not solving your problem at all, but it is more correct to early out there.
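
For reference, the early out is roughly the following check at the top of the backend's render function (a sketch; the exact form and placement in imgui_impl_dx11.cpp may differ slightly):

// In e.g. ImGui_ImplDX11_RenderDrawData():
// Avoid rendering when minimized (the display area has a zero-sized dimension)
if (draw_data->DisplaySize.x <= 0.0f || draw_data->DisplaySize.y <= 0.0f)
    return;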


evga commented Apr 15, 2019

I have no experience with this stuff, but looking at the documentation I found this:

IDXGISwapChain1::Present1 will inform you if your output window is entirely occluded via DXGI_STATUS_OCCLUDED. When this occurs, we recommend that your application go into standby mode (by calling IDXGISwapChain1::Present1 with DXGI_PRESENT_TEST) since resources used to render the frame are wasted. Using DXGI_PRESENT_TEST will prevent any data from being presented while still performing the occlusion check. Once IDXGISwapChain1::Present1 returns S_OK, you should exit standby mode; do not use the return code to switch to standby mode as doing so can leave the swap chain unable to relinquish full-screen mode.

Hope it helps.


evga commented Apr 15, 2019

I've searched a lot and I believe the application has to do the sleeping itself, and keep testing with the DXGI_PRESENT_TEST flag until presenting works again. Something like this:

// Once Present() reports the window as occluded, stop presenting real frames:
// pass DXGI_PRESENT_TEST so only the occlusion check runs, and sleep to
// throttle the loop until the window becomes visible again.
static UINT presentFlags = 0;

if (g_pSwapChain->Present(1, presentFlags) == DXGI_STATUS_OCCLUDED) {
    presentFlags = DXGI_PRESENT_TEST; // Enter standby mode
    Sleep(20);                        // Avoid spinning at full speed while occluded
} else {
    presentFlags = 0;                 // Visible again: resume normal presenting
}
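
(Resetting presentFlags back to 0 only after Present() stops returning DXGI_STATUS_OCCLUDED follows the documentation quoted above: the occlusion test reports S_OK again once the window becomes visible, at which point normal presenting resumes.)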

As for why it works with DX12, I have absolutely no idea :(


ocornut commented Apr 22, 2019

While there are several ways to fix this easily for master, it is trickier to solve in a multi-viewport context (#1542).

I have pushed a small fix for secondary viewports to avoid rendering/swapping minimized windows, but we are left with the problem that if you minimize the main viewport (the one handled by main.cpp) while secondary viewports are showing, there is nothing left to throttle the application. Secondary viewports are currently programmed to present without vsync, on the assumption that the main app presents with vsync.

Currently imgui_impl_win32.cpp in the Docking branch defaults to creating a parent<>child relationship between viewports, so minimizing the main window also minimizes the others and this issue is not noticeable. However, if you set io.ConfigViewportsNoDefaultParent = true to break the parent<>child relationship (which is useful for implementing other features), then you can minimize the main viewport and you'll find that the other viewports burn a CPU core rendering as fast as possible without vsync.

This is also related to the feature discussed in #2471: if we support zero or more than one main viewport, we may need a better standard mechanism to describe how the main loop should handle timing and vsync.

@Plutoberth

I experienced this issue with my application too. I think it could be beneficial if the code evga posted were integrated into the example (even if commented out), or a more sophisticated solution based on the messages passed to the app. I know the examples are used for performance testing, but many beginners copy-paste them and replace the imgui calls with their own (I know I did!). What do you think, @ocornut? Let me know if it's appropriate and I'll create a PR.


mirh commented Jun 2, 2021

#3907 ?


ocornut commented May 23, 2024

I have pushed ec1d2be, which seemingly solves it for both minimization and screen locking. This is implemented for the DX9, DX10, DX11 and DX12 examples. DX9 and DX12 weren't burning as much CPU when looping with no visible swap chain, as Present() would still honor vsync, but it's still best not to loop while locked.
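
The pattern in the example main loops is roughly the following (a sketch based on the DX11 example; g_SwapChainOccluded is a new flag, and names may differ slightly from the commit):

// At the top of each main loop iteration:
// Handle window being minimized or screen locked
if (g_SwapChainOccluded && g_pSwapChain->Present(0, DXGI_PRESENT_TEST) == DXGI_STATUS_OCCLUDED)
{
    ::Sleep(10);
    continue;
}
g_SwapChainOccluded = false;

// ... build and render the frame as usual ...

// Present with vsync and remember whether the swap chain became occluded
HRESULT hr = g_pSwapChain->Present(1, 0);
g_SwapChainOccluded = (hr == DXGI_STATUS_OCCLUDED);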

Note this related issue for multi-viewports: #7615 (comment)

Thanks for the help!
