
renderer/glimp: new GL detection and selection code #478

Merged (2 commits) Feb 23, 2022

Conversation


@illwieckz illwieckz commented Jun 6, 2021

Update: 2022-02-16

This PR was low priority (the existing code works), but it fixes many bugs and is now highly wanted.

The effort was initially motivated by #477 to provide new GL selection code that first detects the highest GL version supported before trying any custom configuration, and, if there is no custom configuration (or it fails), loads the highest GL version possible. This way /gfxinfo displays the highest GL version supported by the card (the Nvidia proprietary driver rewrites the GL and GLSL version strings with the currently requested GL version).

With the default configuration, the code only tries OpenGL 3.2 core and 2.1 compatibility for success (and all versions below on error, to give the user a meaningful error message). By setting r_glExtendedValidation to 1, a user can also detect (at the cost of a slower startup) the best OpenGL profile their hardware and software support, by iterating over all known OpenGL versions, which may produce more useful logs on demand.

Subsequent vid_restart calls are also now faster, by skipping validation tests that were already done before. The window and context are also reused more often when possible.

Some bugs were also fixed in the meantime, like the fullscreen window not being flagged as borderless while the borderless option is enabled, meaning switching from fullscreen to windowed would give the user a bordered window instead of a borderless one.

Most error messages are now more meaningful.

Diving into the code and rewriting it also uncovered some unused variables that were not detected as unused because of how convoluted the code was. Unless I made a mistake, the values of r_ext_multisample and r_alphabits appear to be unused. The first one may explain why I haven't seen multisampling working in years… Some bugs were also caught and fixed, including crashes. In the end, this PR is more about making the code more robust and fixing bugs than about improving logging; logging is just made better in the process (and becomes easier to improve).

An extra commit enables extra compatibility in Intel drivers for some Intel cards on Linux (at this point, it mostly helps to get more meaningful error messages).

About logging: it looks like there is no way to know the available GL profiles and versions without testing all of them. Even glxinfo does it. This is because you need to set the version to create a context, and you need the context to be created to query the version… Hence the need to loop over every GL version and profile to know which are the best supported ones. 🤦‍♀️
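For illustration, here is a minimal sketch of such a probing loop using plain SDL2 calls (GLVersion, ContextWorks and FindBestVersion are made-up names for this example, not the PR's actual identifiers):

#include <SDL.h>
#include <vector>

struct GLVersion { int major; int minor; bool core; };

static bool ContextWorks( SDL_Window *window, const GLVersion &v )
{
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, v.major );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, v.minor );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_PROFILE_MASK,
        v.core ? SDL_GL_CONTEXT_PROFILE_CORE : SDL_GL_CONTEXT_PROFILE_COMPATIBILITY );

    // The version can only be checked by actually creating a context.
    SDL_GLContext context = SDL_GL_CreateContext( window );

    if ( context == nullptr )
    {
        return false; // Invalid context: this version/profile pair is unsupported.
    }

    SDL_GL_DeleteContext( context ); // Valid context: destroy it and keep probing.
    return true;
}

static GLVersion FindBestVersion( SDL_Window *window, const std::vector<GLVersion> &knownVersions )
{
    GLVersion best = { 0, 0, false };

    for ( const GLVersion &v : knownVersions )
    {
        if ( ContextWorks( window, v ) )
        {
            best = v; // Do not stop on a failure: a higher version may still work.
        }
    }

    return best;
}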


sdl_glimp,tr_init: rewrite the original GL selection code, improve gfxinfo

This commit squashes multiple commits by illwieckz and slipher.

Squashed commits by illwieckz

sdl_glimp: rewrite the original GL selection code

  • Detect best configuration possible.
  • Try custom configuration if it exists.
  • If no custom configuration or if it fails,
    load the recommended configuration if
    possible (OpenGL 3.2 core) or the best
    one available.
  • Reuse window and context when possible.
  • Display meaningful popup telling user
    OpenGL version is too low or required
    extensions are missing when that happens.
  • Rely on more return codes for GLimp_SetMode().
  • Test for negative SDL_Init return value,
    not just -1 (see the sketch after this list).

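As a side note on the SDL_Init bullet above, SDL_Init() returns 0 on success and a negative value on failure, so the robust check is simply (sketch only, not the PR's exact code):

if ( SDL_Init( SDL_INIT_VIDEO ) < 0 )
{
    // Treat any negative value as failure; report SDL_GetError() and abort video init.
    return false;
}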
sdl_glimp,tr_init: do not test all OpenGL versions unless r_glExtendedValidation is set

When r_glExtendedValidation is enabled, the engine
tests if OpenGL versions higher than 3.2 core
are supported for logging and diagnostic purposes,
but still requests 3.2 core anyway. Some drivers
may provide more than 3.2 when requesting 3.2; this
is not our fault.

sdl_glimp,tr_init: rewrite logging, improve gfxinfo

  • Move the GL query for logging purposes from tr_init to sdl_glimp.
  • Do not split the MODE log message.
  • Unify some logs.
  • Add more debug logs when building the GL extension list.
  • Also increase the extensions_string length to not
    truncate the string; 4096 is not enough, there can
    be more than 6000 characters on an OpenGL 4.6 driver.
  • Also log missing extensions to make gfxinfo more useful.
  • Rewrite gfxinfo in a more useful way.
  • List enabled and missing GL extensions.

sdl_glimp: silence the GL error when querying if context is core on non-core implementation

  • Silence the error that may happen when querying if the
    OpenGL context uses the core profile when the core profile is not
    supported by the OpenGL implementation to begin with.

For example this may happen on implementations not supporting
more than OpenGL 2.1, while forcing OpenGL 2.1 on
implementations supporting higher versions, including
core profiles, may not raise an error.
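As an illustration (not the PR's exact code), silencing such a query with plain GL calls can look like this; GL_CONTEXT_PROFILE_MASK only exists since OpenGL 3.2, so older implementations raise a GL_INVALID_ENUM that has to be drained:

GLint profileMask = 0;
glGetIntegerv( GL_CONTEXT_PROFILE_MASK, &profileMask );

// Pre-3.2 implementations do not know this enum and report GL_INVALID_ENUM;
// drain the error queue so it does not leak into later error checks.
while ( glGetError() != GL_NO_ERROR ) {}

bool coreProfile = ( profileMask & GL_CONTEXT_CORE_PROFILE_BIT ) != 0;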

sdl_glimp: make GLimp_StartDriverAndSetMode only return true on RSERR_OK

  • Only return true on OK, don't return true on unknown errors.

sdl_glimp: catch errors from GLimp_DetectAvailableModes to prevent further segfault

It may be possible to create a valid context that is unusable.

For example the 3840×2160 resolution is too large for the
Radeon 9700, and the Mesa r300 driver may print this error
when the requested resolution is higher than what is supported
by the hardware:

r300: Implementation error: Render targets are too big in r300_set_framebuffer_state, refusing to bind framebuffer state!

It will unfortunately return a valid but unusable context that will
make the engine segfault when calling GL_SetDefaultState().

sdl_glimp: flag fullscreen window as borderless when borderless is enabled

Flag the fullscreen window as borderless when the borderless option
is enabled, otherwise the window will be bordered when leaving
fullscreen while the borderless option is enabled.
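The idea of the fix, sketched with plain SDL2 window flags (the variables fullscreen, borderless, width and height are assumptions for this example, not the engine's actual code):

Uint32 flags = SDL_WINDOW_OPENGL;

if ( fullscreen )
{
    flags |= SDL_WINDOW_FULLSCREEN;
}

if ( borderless )
{
    // Keep the borderless flag even for fullscreen windows, so leaving
    // fullscreen later yields a borderless window as expected.
    flags |= SDL_WINDOW_BORDERLESS;
}

SDL_Window *window = SDL_CreateWindow( "Daemon",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, width, height, flags );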

sdl_glimp,tr_init: remove unused depthBits

sdl_glimp: remove SDL_INIT_NOPARACHUTE

In GLimp_StartDriverAndSetMode() the SDL_INIT_NOPARACHUTE flag was
removed from SDL_Init( SDL_INIT_VIDEO ) as this flag is now
ignored, see https://wiki.libsdl.org/SDL_Init

Squashed commits by slipher

Better type safety in GL detection code

Simplify GL_ValidateBestContext duplicate loop

GLimp_SetMode - simplify error handling

Rework GL initialization

Have GLimp_ValidateBestContext() validate both the highest-numbered
available context (if r_glExtendedValidation is enabled), and the
highest context that we actually want to use (at most 3.2). This means
most of the code in GLimp_ApplyPreferredOptions can be removed because
it was duplicating the knowledge about version preferences and the code
for instantiating them in GLimp_ValidateBestContext.

Also cut down on other code duplication.
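A hedged sketch of that single-pass idea from “Rework GL initialization” (ValidateContext, versionTable and the struct fields are illustrative stand-ins, not the branch's actual code):

struct glConfiguration { int major; int minor; bool core; }; // illustrative stand-in

glConfiguration bestAvailable = { 0, 0, false };  // Highest version that validated at all.
glConfiguration bestPreferred = { 0, 0, false };  // Highest validated version we want to use (at most 3.2).

for ( const glConfiguration &candidate : versionTable )
{
    if ( !ValidateContext( candidate ) )
    {
        continue; // Keep probing: a higher version may still validate.
    }

    bestAvailable = candidate;

    if ( candidate.major < 3 || ( candidate.major == 3 && candidate.minor <= 2 ) )
    {
        bestPreferred = candidate;
    }
}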

Remove dead cvar r_stencilBits

Fix glConfig.colorBits logging

  • show actual not requested
  • don't log it twice at notice level

system: enable GL 2.1 on Intel Gen 3 hardware, <3 papap

This may not be enough to make the game run on such
hardware anyway.

The Mesa i915 driver for GMA Gen 3 disabled GL 2.1 on such
hardware to force Google Chrome to use its CPU fallback,
which was faster, but we don't implement such a fallback.
See https://gitlab.freedesktop.org/mesa/mesa/-/commit/a1891da7c865c80d95c450abfc0d2bc49db5f678

Only Mesa i915 on Linux supports GL 2.1 for GMA Gen 3,
so no similar tweak is required for Windows and macOS.

Enabling those options at least makes the engine
properly report a missing extension instead of a missing
GL version; for example the Intel GMA 3100 G33 (Gen 3)
will report the missing GL_ARB_half_float_vertex extension
instead of a missing OpenGL 2.1 version.

The GMA 3150 is known to have wider OpenGL support than
the GMA 3100; for example it has an OpenGL version similar to
GMA 4 on Windows while being a GMA 3, so the list of
available GL extensions may be larger.

Thanks to papap and their LanPower association for the kind
report and availability for testing.
http://asso.lanpower.free.fr

@illwieckz added the T-Improvement (Improvement for an existing feature) and A-Renderer labels on Jun 6, 2021
@illwieckz

Note: on this page of the Khronos wiki I have read:

Warning: Unfortunately, Windows does not allow the user to change the pixel format of a window. You get to set it exactly once. Therefore, if you want to use a different pixel format from the one your fake context used (for sRGB or multisample framebuffers, or just different bit-depths of buffers), you must destroy the window entirely and recreate it after we are finished with the dummy context.

And in this code, I never destroy a window when destroying a context before creating another one. The thing is, with SDL, you create a context from a window, so at no time do we set the pixel format when creating the window; it is set when creating the context from the window which already exists. So I would like to see some Windows users test this code on Windows.

Assuming Windows would just create a 24-bit window by default, it's also possible that trying to create a 16-bit context on a 24-bit window will just fail, which would be perfectly OK since the next attempt (24-bit context) will succeed. But it's better to test it on Windows.

@illwieckz

illwieckz commented Jun 6, 2021

Before:

/gfxinfo
GL_VENDOR: NVIDIA Corporation
GL_RENDERER: Quadro K1100M/PCIe/SSE2
GL_VERSION: 3.2.0 NVIDIA 340.108
GL_SHADING_LANGUAGE_VERSION: 1.50 NVIDIA via Cg compiler
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096
Using OpenGL 3.x context
Using GPU vertex skinning with max 233 bones in a single pass

After:

/gfxinfo
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 4.4.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 4.40 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a core profile 
Using GPU vertex skinning with max 233 bones in a single pass 

When forcing configuration:

/r_glMajorVersion 3 ; r_glMinorVersion 1 ; vid_restart
/gfxinfo
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 3.1.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 1.40 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using GPU vertex skinning with max 233 bones in a single pass 
/r_glMajorVersion 3 ; r_glMinorVersion 2 ; vid_restart
/gfxinfo
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 3.2.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 1.50 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a core profile 
Using GPU vertex skinning with max 233 bones in a single pass 
/r_glMajorVersion 4 ; r_glMinorVersion 1 ; vid_restart
/gfxinfo
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 4.1.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 4.10 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a core profile 
Using GPU vertex skinning with max 233 bones in a single pass 
/r_glMajorVersion "" ; r_glMinorVersion "" ; r_glProfile compat ; vid_restart
/gfxinfo
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 4.4.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 4.40 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a compatibility profile 
Using GPU vertex skinning with max 233 bones in a single pass 

@illwieckz

Also:

/r_glMajorVersion 1 ; r_glMinorVersion 3 ; vid_restart
The change to r_glMajorVersion will take effect after restart. 
The change to r_glMinorVersion will take effect after restart. 
Debug: Shutting down OpenGL subsystem 
Debug: Destroying 1920×1080 SDL window at 0,27 
Calling GetRefAPI… 
SDL_Init( SDL_INIT_VIDEO )...  
Using SDL Version 2.0.10 
SDL using driver "x11" 
Initializing OpenGL display 
Display aspect: 1.778 
...setting mode -1: 
 1920 1080 
Debug: SDL borderless window created at 0,0 with 1920×1080 size 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 2.2 compat 
Debug: Invalid context: 24-bit GL 2.2 compat 
Debug: Valid context: 16-bit GL 3.0 compat 
Debug: Valid context: 24-bit GL 3.0 compat 
Debug: Valid context: 16-bit GL 3.1 compat 
Debug: Valid context: 24-bit GL 3.1 compat 
Debug: Valid context: 16-bit GL 3.2 core 
Debug: Valid context: 24-bit GL 3.2 core 
Debug: Valid context: 16-bit GL 3.3 core 
Debug: Valid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 3.4 core 
Debug: Invalid context: 24-bit GL 3.4 core 
Debug: Valid context: 16-bit GL 4.0 core 
Debug: Valid context: 24-bit GL 4.0 core 
Debug: Valid context: 16-bit GL 4.1 core 
Debug: Valid context: 24-bit GL 4.1 core 
Debug: Valid context: 16-bit GL 4.2 core 
Debug: Valid context: 24-bit GL 4.2 core 
Debug: Valid context: 16-bit GL 4.3 core 
Debug: Valid context: 24-bit GL 4.3 core 
Debug: Valid context: 16-bit GL 4.4 core 
Debug: Valid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 5.0 core 
Debug: Invalid context: 24-bit GL 5.0 core 
Best context: 24-bit GL 4.4 core 
Warn: OpenGL 1.3 is not supported, trying 4.4 instead 
Debug: Created best context: 24-bit GL 4.4 core 
Using 24 Color bits, 0 depth, 8 stencil display. 
Using GLEW 2.1.0 
Using enhanced (GL3) Renderer in GL 3.x mode... 
Available modes: '640x360 720x405 864x486 960x540 1024x576 1280x720 1600x900 1920x1080 1368x768 1360x768 1280x800 1440x900 1680x1050 1600x1024 1400x900 640x480 800x600 1024x768 1152x864 1280x960 1400x1050 1280x1024 ' 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
Detected graphics driver class 'OpenGL 3+' 
Detected graphics hardware class 'generic' 
Initializing OpenGL extensions 
...ignoring GL_ARB_debug_output 
...found shading language version 440 
...using GL_ARB_half_float_pixel 
...using GL_ARB_texture_float 
...using GL_EXT_gpu_shader4 
...using GL_EXT_texture_integer 
...using GL_ARB_texture_rg 
...found buggy Nvidia 340 driver 
...ignoring GL_ARB_texture_gather 
...using GL_EXT_texture_compression_s3tc 
...using GL_ARB_texture_compression_rgtc 
...using GL_EXT_texture_filter_anisotropic 
...using GL_ARB_half_float_vertex 
...using GL_ARB_framebuffer_object 
...using GL_ARB_get_program_binary 
...using GL_ARB_buffer_storage 
...using GL_ARB_uniform_buffer_object 
...using GL_ARB_map_buffer_range 
...using GL_ARB_sync 
glsl shaders took 123 msec to build 
Warn: Couldn't find image file 'flareShader' 
Warn: Couldn't find image file 'sun' 
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 4.4.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 4.40 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a core profile 
Using GPU vertex skinning with max 233 bones in a single pass 

And:

/r_glMajorVersion 4 ; r_glMinorVersion 6 ; vid_restart
The change to r_glMajorVersion will take effect after restart. 
The change to r_glMinorVersion will take effect after restart. 
Debug: Shutting down OpenGL subsystem 
Debug: Destroying 1920×1080 SDL window at 0,27 
Calling GetRefAPI… 
SDL_Init( SDL_INIT_VIDEO )...  
Using SDL Version 2.0.10 
SDL using driver "x11" 
Initializing OpenGL display 
Display aspect: 1.778 
...setting mode -1: 
 1920 1080 
Debug: SDL borderless window created at 0,0 with 1920×1080 size 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 2.2 compat 
Debug: Invalid context: 24-bit GL 2.2 compat 
Debug: Valid context: 16-bit GL 3.0 compat 
Debug: Valid context: 24-bit GL 3.0 compat 
Debug: Valid context: 16-bit GL 3.1 compat 
Debug: Valid context: 24-bit GL 3.1 compat 
Debug: Valid context: 16-bit GL 3.2 core 
Debug: Valid context: 24-bit GL 3.2 core 
Debug: Valid context: 16-bit GL 3.3 core 
Debug: Valid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 3.4 core 
Debug: Invalid context: 24-bit GL 3.4 core 
Debug: Valid context: 16-bit GL 4.0 core 
Debug: Valid context: 24-bit GL 4.0 core 
Debug: Valid context: 16-bit GL 4.1 core 
Debug: Valid context: 24-bit GL 4.1 core 
Debug: Valid context: 16-bit GL 4.2 core 
Debug: Valid context: 24-bit GL 4.2 core 
Debug: Valid context: 16-bit GL 4.3 core 
Debug: Valid context: 24-bit GL 4.3 core 
Debug: Valid context: 16-bit GL 4.4 core 
Debug: Valid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 5.0 core 
Debug: Invalid context: 24-bit GL 5.0 core 
Best context: 24-bit GL 4.4 core 
Debug: GL version 4.6 is forced by r_MajorVersion and r_MinorVersion 
Warn: Failed custom context: 24-bit GL 4.6 core 
Warn: SDL_GL_CreateContext failed: Could not create GL context: BadMatch (invalid parameter attributes) 
Debug: Created best context: 24-bit GL 4.4 core 
Using 24 Color bits, 0 depth, 8 stencil display. 
Using GLEW 2.1.0 
Using enhanced (GL3) Renderer in GL 3.x mode... 
Available modes: '640x360 720x405 864x486 960x540 1024x576 1280x720 1600x900 1920x1080 1368x768 1360x768 1280x800 1440x900 1680x1050 1600x1024 1400x900 640x480 800x600 1024x768 1152x864 1280x960 1400x1050 1280x1024 ' 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
Detected graphics driver class 'OpenGL 3+' 
Detected graphics hardware class 'generic' 
Initializing OpenGL extensions 
...ignoring GL_ARB_debug_output 
...found shading language version 440 
...using GL_ARB_half_float_pixel 
...using GL_ARB_texture_float 
...using GL_EXT_gpu_shader4 
...using GL_EXT_texture_integer 
...using GL_ARB_texture_rg 
...found buggy Nvidia 340 driver 
...ignoring GL_ARB_texture_gather 
...using GL_EXT_texture_compression_s3tc 
...using GL_ARB_texture_compression_rgtc 
...using GL_EXT_texture_filter_anisotropic 
...using GL_ARB_half_float_vertex 
...using GL_ARB_framebuffer_object 
...using GL_ARB_get_program_binary 
...using GL_ARB_buffer_storage 
...using GL_ARB_uniform_buffer_object 
...using GL_ARB_map_buffer_range 
...using GL_ARB_sync 
glsl shaders took 125 msec to build 
Warn: Couldn't find image file 'flareShader' 
Warn: Couldn't find image file 'sun' 
GL_VENDOR: NVIDIA Corporation 
GL_RENDERER: Quadro K1100M/PCIe/SSE2 
GL_VERSION: 4.4.0 NVIDIA 340.108 
GL_SHADING_LANGUAGE_VERSION: 4.40 NVIDIA via Cg compiler 
GL_MAX_VERTEX_UNIFORM_COMPONENTS 4096 
Using OpenGL 3.x context 
Having a core profile 
Using GPU vertex skinning with max 233 bones in a single pass 


@slipher slipher left a comment


What is the motivation for reusing the already created SDL window?

I still have not reviewed “rewrite the original GL selection code”.

@illwieckz

illwieckz commented Jun 29, 2021

One thing I would like to see is that code being tested on Windows, and, if it is still possible to set a 16-bit display on Windows 10, tested on a 16-bit Windows display as well.

@illwieckz

What is the motivation for reusing the already created SDL window?

To avoid the engine cycling through creating, destroying and creating a window again when testing various configurations, potentially triggering window manager animations. Before this PR we would have tested 4 configurations in the worst case; after this PR the code is expected to test all configurations that work until the highest is found, so we may not want to create and destroy 10 windows in a row…

We need a window to already be created to create a GL context, and we need to create a context to test a configuration, but we don't need to create a new window each time we create a GL context to test whether such a context is valid.
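A minimal sketch of that reuse (illustrative only, reusing the GLVersion struct from the sketch in the first post): the window is created once, and every probed configuration only creates and destroys a GL context on it.

SDL_Window *window = SDL_CreateWindow( "Daemon",
    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, width, height, SDL_WINDOW_OPENGL );

for ( const GLVersion &v : knownVersions )
{
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MAJOR_VERSION, v.major );
    SDL_GL_SetAttribute( SDL_GL_CONTEXT_MINOR_VERSION, v.minor );

    // No SDL_DestroyWindow()/SDL_CreateWindow() pair here: only the context changes.
    SDL_GLContext context = SDL_GL_CreateContext( window );

    if ( context != nullptr )
    {
        SDL_GL_DeleteContext( context );
    }
}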

@illwieckz

That push only fixed the unnecessary .c_str().

@illwieckz force-pushed the illwieckz/gldetect branch 2 times, most recently from a1654eb to ebbfc1b on July 14, 2021 15:09
@illwieckz

I added a new commit setting some environment variables to enable GL 2.1 on Intel GMA Gen 3 on Linux.

The Mesa i915 driver for GMA Gen 3 disabled GL 2.1 on such hardware to force Google Chrome to use its CPU fallback, which was faster, but we don't implement such a fallback.

See https://gitlab.freedesktop.org/mesa/mesa/-/commit/a1891da7c865c80d95c450abfc0d2bc49db5f678

Only Mesa i915 on Linux supports GL 2.1 for GMA Gen 3, so no similar tweak is required for Windows and macOS.

Mesa i915 and macOS also support GL 2.1 on GMA Gen 4 while the Windows drivers don't, but those tweaks are not required there as the related features are enabled by default.

The first Intel hardware range expected to have drivers supporting GL 2.1 on Windows is GMA Gen 5.

Enabling those options at least makes the engine properly report a missing extension instead of a missing GL version; for example the Intel GMA 3100 G33 (Gen 3) will report the missing GL_ARB_half_float_vertex extension instead of a missing OpenGL 2.1 version.

The GMA 3150 is known to have wider OpenGL support than the GMA 3100; for example it has an OpenGL version similar to GMA 4 on Windows while being a GMA 3, so the list of available GL extensions may be larger. Though I have not got my hands on one.

Thanks to papap and their LanPower association ( http://asso.lanpower.free.fr ) for the kind report and availability for testing.

Before:

Debug: ----- R_Init -----
SDL_Init( SDL_INIT_VIDEO )... 
Using SDL Version 2.0.10
SDL using driver "x11"
Initializing OpenGL display
Display aspect: 1.250
...setting mode -1: 640×480
Debug: SDL borderless window created at 320,272 with 640×480 size
Debug: Invalid context: 16-bit GL 2.1 compat
Debug: Invalid context: 24-bit GL 2.1 compat
Debug: Invalid context: 16-bit GL 3.0 compat
Debug: Invalid context: 24-bit GL 3.0 compat
Best context: 16-bit GL 0.0 compat
Debug: Destroying 640×480 SDL window at 320,272
Warn: OpenGL is too old.

You need a graphic card with drivers supporting at least OpenGL 3.2
or OpenGL 2.1 with ARB_half_float_vertex and ARB_framebuffer_object.
Debug: ----- CL_Shutdown -----

After:

Debug: ----- R_Init -----
SDL_Init( SDL_INIT_VIDEO )... 
Using SDL Version 2.0.10
ATTENTION: default value of option fragment_shader overridden by environment.
ATTENTION: default value of option stub_occlusion_query overridden by environment.
SDL using driver "x11"
Initializing OpenGL display
Display aspect: 1.250
...setting mode -1: 640×480
Debug: SDL borderless window created at 320,272 with 640×480 size
Debug: Valid context: 16-bit GL 2.1 compat
Debug: Valid context: 24-bit GL 2.1 compat
Debug: Invalid context: 16-bit GL 2.2 compat
Debug: Invalid context: 24-bit GL 2.2 compat
Debug: Invalid context: 16-bit GL 3.0 compat
Debug: Invalid context: 24-bit GL 3.0 compat
Best context: 24-bit GL 2.1 compat
Debug: Created best context: 24-bit GL 2.1 compat
Using 24 Color bits, 0 depth, 8 stencil display.
Using GLEW 2.1.0
Using GL3 Renderer in GL 2.x mode...
Available modes: '1280x1024 640x480 800x600 1024x768 '
GL_RENDERER: Mesa DRI Intel(R) G33 
Detected graphics driver class 'integrated'
Detected graphics hardware class 'generic'
Initializing OpenGL extensions
...ignoring GL_ARB_debug_output
...found shading language version 120
...using GL_ARB_half_float_pixel
...GL_ARB_texture_float not found
...GL_EXT_gpu_shader4 not found
...GL_EXT_texture_integer not found
...GL_ARB_texture_rg not found
...GL_ARB_texture_gather not found
...using GL_EXT_texture_compression_s3tc
...GL_ARB_texture_compression_rgtc not found
...using GL_EXT_texture_filter_anisotropic
Warn: Required extension GL_ARB_half_float_vertex is missing
Debug: ----- CL_Shutdown -----

@illwieckz force-pushed the illwieckz/gldetect branch 4 times, most recently from 7d3349c to df91237 on July 18, 2021 10:31
@illwieckz

illwieckz commented Jul 18, 2021

I rewrote the commit “new GL detection and selection code”; after some time I was not happy with the way I did it first, for example I discovered that if a hypothetical driver had 3.3 support but not 2.1, the code may not test for 3.3. So now the code just tests every supported GL version, remembers the best supported one to use it, but does not stop on any unsupported version. The behaviour is now very similar to what the glxinfo command-line tool does on Linux. Not only does such a rewrite make the code more robust and less convoluted, it makes logging more useful when debugging. One can ask a user to enable the debug log, run the game and then send the daemon.log file, and we would get a good picture of the OpenGL support.

Also, when a user has a too old OpenGL version, the error message now tells the user which old OpenGL version they have.

I fixed a bug where the available OpenGL extension list was truncated.

The code now only tests for GL versions that exist to this day, so it will not test for a future GL 4.7 if one is released some day, without patching this code. This is the way usual tools (like glxinfo) work.

About specific commits:

  • rewrite the original GL selection code to help to understand it
    The code modified by this commit gets entirely rewritten later; this is a commit that is expected to work, but its main purpose was to clean up the code in order to make me understand it before rewriting it.
    There is no need to do an extensive review like checking for unused variables; this commit does not claim to fix unused variables. What's important to review is that the commit does not introduce mistakes or obviously wrong things, because the further rewrite is based on the assumption that this commit does not break anything.
  • reuse already created SDL window
    This commit is another rewrite that makes the following rewrite easier.
    Review should focus on testing this code on 16-bit and 24-bit displays on Windows and verifying that it works (and that the render is hardware accelerated).
  • new GL detection and selection code
    The actual GL detection and selection code to be used after this PR is merged.
    The diff is not really meaningful; the review would be more efficient by looking at the new code. It's not a derivative of the old implementation, it's a new implementation to replace the old one.

Other commits are simple and small and can be reviewed as usual.

On Linux with Mesa, one may want to try those various environment variables to trigger some code path:

  • export LIBGL_DRIVERS_PATH=''
    SDL error for failing to load the OpenGL driver (Dæmon shuts down before being able to call SDL_GetError).
  • export DISPLAY='none'
    SDL error for failing to open an X11 display (Dæmon shuts down before being able to call SDL_GetError).
  • export MESA_GL_VERSION_OVERRIDE='0.1'
    SDL error for badValue (Dæmon shuts down before being able to call SDL_GetError).
  • export MESA_GL_VERSION_OVERRIDE='1.0'
    Dæmon error about OpenGL not available (GL 1.0 does not exist but major 1 is valid).
  • export MESA_GL_VERSION_OVERRIDE='1.1'
    Dæmon error about OpenGL being too old.
  • export MESA_GL_VERSION_OVERRIDE='2.1'
    Dæmon using GL3 renderer in 2.x compatibility mode.
  • export MESA_GL_VERSION_OVERRIDE='2.1'; export MESA_EXTENSION_OVERRIDE='-GL_ARB_framebuffer_object'
    Dæmon error about missing ARB_framebuffer_object OpenGL extension.
  • export MESA_GL_VERSION_OVERRIDE='2.1'; export MESA_EXTENSION_OVERRIDE='-GL_ARB_half_float_vertex'
    Dæmon error about missing ARB_half_float_vertex OpenGL extension.
  • export MESA_GL_VERSION_OVERRIDE='3.2'
    Dæmon using GL3 renderer with 3.2 OpenGL version even if higher is available.
  • export MESA_GL_VERSION_OVERRIDE='4.6'
    Dæmon using GL3 renderer with 4.6 OpenGL version.

I noticed the engine crashes when overriding a GL version that does not exist (like MESA_GL_VERSION_OVERRIDE='5.7'), but the root cause looks to be a bug in the underlying tech. An invalid pointer to the GL extension string is returned instead of either something valid or 0 (which would mean an error), so the engine attempts to access this invalid pointer. I'm not sure we can do anything to catch this error, except catching the segfault itself.

@illwieckz

Some samples of various logs:

Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Invalid context: 16-bit GL 1.1 compat 
Debug: Invalid context: 24-bit GL 1.1 compat 
Debug: Invalid context: 16-bit GL 1.2 compat 
Debug: Invalid context: 24-bit GL 1.2 compat 
Debug: Invalid context: 16-bit GL 1.3 compat 
Debug: Invalid context: 24-bit GL 1.3 compat 
Debug: Invalid context: 16-bit GL 1.4 compat 
Debug: Invalid context: 24-bit GL 1.4 compat 
Debug: Invalid context: 16-bit GL 1.5 compat 
Debug: Invalid context: 24-bit GL 1.5 compat 
Debug: Invalid context: 16-bit GL 2.0 compat 
Debug: Invalid context: 24-bit GL 2.0 compat 
Debug: Invalid context: 16-bit GL 2.1 compat 
Debug: Invalid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 3.0 compat 
Debug: Invalid context: 24-bit GL 3.0 compat 
Debug: Invalid context: 16-bit GL 3.1 compat 
Debug: Invalid context: 24-bit GL 3.1 compat 
Debug: Invalid context: 16-bit GL 3.2 core 
Debug: Invalid context: 24-bit GL 3.2 core 
Debug: Invalid context: 16-bit GL 3.3 core 
Debug: Invalid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 4.0 core 
Debug: Invalid context: 24-bit GL 4.0 core 
Debug: Invalid context: 16-bit GL 4.1 core 
Debug: Invalid context: 24-bit GL 4.1 core 
Debug: Invalid context: 16-bit GL 4.2 core 
Debug: Invalid context: 24-bit GL 4.2 core 
Debug: Invalid context: 16-bit GL 4.3 core 
Debug: Invalid context: 24-bit GL 4.3 core 
Debug: Invalid context: 16-bit GL 4.4 core 
Debug: Invalid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 4.6 core 
Debug: Invalid context: 24-bit GL 4.6 core 
Debug: Destroying 3840×2160 SDL window at 0,0 
Warn: OpenGL is not available.

You need a graphic card with drivers supporting at least OpenGL 3.2
or OpenGL 2.1 with ARB_half_float_vertex and ARB_framebuffer_object.
Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Valid context: 16-bit GL 1.1 compat 
Debug: Valid context: 24-bit GL 1.1 compat 
Debug: Invalid context: 16-bit GL 1.2 compat 
Debug: Invalid context: 24-bit GL 1.2 compat 
Debug: Invalid context: 16-bit GL 1.3 compat 
Debug: Invalid context: 24-bit GL 1.3 compat 
Debug: Invalid context: 16-bit GL 1.4 compat 
Debug: Invalid context: 24-bit GL 1.4 compat 
Debug: Invalid context: 16-bit GL 1.5 compat 
Debug: Invalid context: 24-bit GL 1.5 compat 
Debug: Invalid context: 16-bit GL 2.0 compat 
Debug: Invalid context: 24-bit GL 2.0 compat 
Debug: Invalid context: 16-bit GL 2.1 compat 
Debug: Invalid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 3.0 compat 
Debug: Invalid context: 24-bit GL 3.0 compat 
Debug: Invalid context: 16-bit GL 3.1 compat 
Debug: Invalid context: 24-bit GL 3.1 compat 
Debug: Invalid context: 16-bit GL 3.2 core 
Debug: Invalid context: 24-bit GL 3.2 core 
Debug: Invalid context: 16-bit GL 3.3 core 
Debug: Invalid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 4.0 core 
Debug: Invalid context: 24-bit GL 4.0 core 
Debug: Invalid context: 16-bit GL 4.1 core 
Debug: Invalid context: 24-bit GL 4.1 core 
Debug: Invalid context: 16-bit GL 4.2 core 
Debug: Invalid context: 24-bit GL 4.2 core 
Debug: Invalid context: 16-bit GL 4.3 core 
Debug: Invalid context: 24-bit GL 4.3 core 
Debug: Invalid context: 16-bit GL 4.4 core 
Debug: Invalid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 4.6 core 
Debug: Invalid context: 24-bit GL 4.6 core 
Best context: 24-bit GL 1.1 compat 
Debug: Created best context: 24-bit GL 1.1 compat 
Using 24 Color bits, 0 depth, 8 stencil display. 
Debug: Provided OpenGL 1.1 version. 
Debug: Destroying 3840×2160 SDL window at 0,0 
Warn: OpenGL 1.1 is too old.

You need a graphic card with drivers supporting at least OpenGL 3.2
or OpenGL 2.1 with ARB_half_float_vertex and ARB_framebuffer_object. 
Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Valid context: 16-bit GL 1.1 compat 
Debug: Valid context: 24-bit GL 1.1 compat 
Debug: Valid context: 16-bit GL 1.2 compat 
Debug: Valid context: 24-bit GL 1.2 compat 
Debug: Valid context: 16-bit GL 1.3 compat 
Debug: Valid context: 24-bit GL 1.3 compat 
Debug: Valid context: 16-bit GL 1.4 compat 
Debug: Valid context: 24-bit GL 1.4 compat 
Debug: Valid context: 16-bit GL 1.5 compat 
Debug: Valid context: 24-bit GL 1.5 compat 
Debug: Valid context: 16-bit GL 2.0 compat 
Debug: Valid context: 24-bit GL 2.0 compat 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 3.0 compat 
Debug: Invalid context: 24-bit GL 3.0 compat 
Debug: Invalid context: 16-bit GL 3.1 compat 
Debug: Invalid context: 24-bit GL 3.1 compat 
Debug: Invalid context: 16-bit GL 3.2 core 
Debug: Invalid context: 24-bit GL 3.2 core 
Debug: Invalid context: 16-bit GL 3.3 core 
Debug: Invalid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 4.0 core 
Debug: Invalid context: 24-bit GL 4.0 core 
Debug: Invalid context: 16-bit GL 4.1 core 
Debug: Invalid context: 24-bit GL 4.1 core 
Debug: Invalid context: 16-bit GL 4.2 core 
Debug: Invalid context: 24-bit GL 4.2 core 
Debug: Invalid context: 16-bit GL 4.3 core 
Debug: Invalid context: 24-bit GL 4.3 core 
Debug: Invalid context: 16-bit GL 4.4 core 
Debug: Invalid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 4.6 core 
Debug: Invalid context: 24-bit GL 4.6 core 
Best context: 24-bit GL 2.1 compat 
Debug: Created best context: 24-bit GL 2.1 compat 
Using 24 Color bits, 0 depth, 8 stencil display. 
Debug: Provided OpenGL 2.1 version. 
Using GL3 Renderer in GL 2.x mode... 
Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Valid context: 16-bit GL 1.1 compat 
Debug: Valid context: 24-bit GL 1.1 compat 
Debug: Valid context: 16-bit GL 1.2 compat 
Debug: Valid context: 24-bit GL 1.2 compat 
Debug: Valid context: 16-bit GL 1.3 compat 
Debug: Valid context: 24-bit GL 1.3 compat 
Debug: Valid context: 16-bit GL 1.4 compat 
Debug: Valid context: 24-bit GL 1.4 compat 
Debug: Valid context: 16-bit GL 1.5 compat 
Debug: Valid context: 24-bit GL 1.5 compat 
Debug: Valid context: 16-bit GL 2.0 compat 
Debug: Valid context: 24-bit GL 2.0 compat 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Invalid context: 16-bit GL 3.0 compat 
Debug: Invalid context: 24-bit GL 3.0 compat 
Debug: Invalid context: 16-bit GL 3.1 compat 
Debug: Invalid context: 24-bit GL 3.1 compat 
Debug: Invalid context: 16-bit GL 3.2 core 
Debug: Invalid context: 24-bit GL 3.2 core 
Debug: Invalid context: 16-bit GL 3.3 core 
Debug: Invalid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 4.0 core 
Debug: Invalid context: 24-bit GL 4.0 core 
Debug: Invalid context: 16-bit GL 4.1 core 
Debug: Invalid context: 24-bit GL 4.1 core 
Debug: Invalid context: 16-bit GL 4.2 core 
Debug: Invalid context: 24-bit GL 4.2 core 
Debug: Invalid context: 16-bit GL 4.3 core 
Debug: Invalid context: 24-bit GL 4.3 core 
Debug: Invalid context: 16-bit GL 4.4 core 
Debug: Invalid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 4.6 core 
Debug: Invalid context: 24-bit GL 4.6 core 
Best context: 24-bit GL 2.1 compat 
Debug: Created best context: 24-bit GL 2.1 compat 
Using 24 Color bits, 0 depth, 8 stencil display. 
Debug: Provided OpenGL 2.1 version. 
Using GL3 Renderer in GL 2.x mode... 
…
Initializing OpenGL extensions 
...ignoring GL_ARB_debug_output 
...found shading language version 460 
...using GL_ARB_half_float_pixel 
...using GL_ARB_texture_float 
...using GL_EXT_gpu_shader4 
...using GL_EXT_texture_integer 
...using GL_ARB_texture_rg 
...using GL_ARB_texture_gather 
...using GL_EXT_texture_compression_s3tc 
...using GL_ARB_texture_compression_rgtc 
...using GL_EXT_texture_filter_anisotropic 
Warn: Required extension GL_ARB_half_float_vertex is missing. 
Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Valid context: 16-bit GL 1.1 compat 
Debug: Valid context: 24-bit GL 1.1 compat 
Debug: Valid context: 16-bit GL 1.2 compat 
Debug: Valid context: 24-bit GL 1.2 compat 
Debug: Valid context: 16-bit GL 1.3 compat 
Debug: Valid context: 24-bit GL 1.3 compat 
Debug: Valid context: 16-bit GL 1.4 compat 
Debug: Valid context: 24-bit GL 1.4 compat 
Debug: Valid context: 16-bit GL 1.5 compat 
Debug: Valid context: 24-bit GL 1.5 compat 
Debug: Valid context: 16-bit GL 2.0 compat 
Debug: Valid context: 24-bit GL 2.0 compat 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Valid context: 16-bit GL 3.0 compat 
Debug: Valid context: 24-bit GL 3.0 compat 
Debug: Valid context: 16-bit GL 3.1 compat 
Debug: Valid context: 24-bit GL 3.1 compat 
Debug: Valid context: 16-bit GL 3.2 core 
Debug: Valid context: 24-bit GL 3.2 core 
Debug: Invalid context: 16-bit GL 3.3 core 
Debug: Invalid context: 24-bit GL 3.3 core 
Debug: Invalid context: 16-bit GL 4.0 core 
Debug: Invalid context: 24-bit GL 4.0 core 
Debug: Invalid context: 16-bit GL 4.1 core 
Debug: Invalid context: 24-bit GL 4.1 core 
Debug: Invalid context: 16-bit GL 4.2 core 
Debug: Invalid context: 24-bit GL 4.2 core 
Debug: Invalid context: 16-bit GL 4.3 core 
Debug: Invalid context: 24-bit GL 4.3 core 
Debug: Invalid context: 16-bit GL 4.4 core 
Debug: Invalid context: 24-bit GL 4.4 core 
Debug: Invalid context: 16-bit GL 4.5 core 
Debug: Invalid context: 24-bit GL 4.5 core 
Debug: Invalid context: 16-bit GL 4.6 core 
Debug: Invalid context: 24-bit GL 4.6 core 
Best context: 24-bit GL 3.2 core 
Debug: Created best context: 24-bit GL 3.2 core 
Using 24 Color bits, 0 depth, 8 stencil display. 
Debug: Provided OpenGL 3.2 version. 
Using GL3 Renderer in GL 3.x mode... 
Debug: SDL borderless window created at 0,0 with 3840×2160 size 
Debug: Valid context: 16-bit GL 1.1 compat 
Debug: Valid context: 24-bit GL 1.1 compat 
Debug: Valid context: 16-bit GL 1.2 compat 
Debug: Valid context: 24-bit GL 1.2 compat 
Debug: Valid context: 16-bit GL 1.3 compat 
Debug: Valid context: 24-bit GL 1.3 compat 
Debug: Valid context: 16-bit GL 1.4 compat 
Debug: Valid context: 24-bit GL 1.4 compat 
Debug: Valid context: 16-bit GL 1.5 compat 
Debug: Valid context: 24-bit GL 1.5 compat 
Debug: Valid context: 16-bit GL 2.0 compat 
Debug: Valid context: 24-bit GL 2.0 compat 
Debug: Valid context: 16-bit GL 2.1 compat 
Debug: Valid context: 24-bit GL 2.1 compat 
Debug: Valid context: 16-bit GL 3.0 compat 
Debug: Valid context: 24-bit GL 3.0 compat 
Debug: Valid context: 16-bit GL 3.1 compat 
Debug: Valid context: 24-bit GL 3.1 compat 
Debug: Valid context: 16-bit GL 3.2 core 
Debug: Valid context: 24-bit GL 3.2 core 
Debug: Valid context: 16-bit GL 3.3 core 
Debug: Valid context: 24-bit GL 3.3 core 
Debug: Valid context: 16-bit GL 4.0 core 
Debug: Valid context: 24-bit GL 4.0 core 
Debug: Valid context: 16-bit GL 4.1 core 
Debug: Valid context: 24-bit GL 4.1 core 
Debug: Valid context: 16-bit GL 4.2 core 
Debug: Valid context: 24-bit GL 4.2 core 
Debug: Valid context: 16-bit GL 4.3 core 
Debug: Valid context: 24-bit GL 4.3 core 
Debug: Valid context: 16-bit GL 4.4 core 
Debug: Valid context: 24-bit GL 4.4 core 
Debug: Valid context: 16-bit GL 4.5 core 
Debug: Valid context: 24-bit GL 4.5 core 
Debug: Valid context: 16-bit GL 4.6 core 
Debug: Valid context: 24-bit GL 4.6 core 
Best context: 24-bit GL 4.6 core 
Debug: Created best context: 24-bit GL 4.6 core 
Using 24 Color bits, 0 depth, 8 stencil display. 
Debug: Provided OpenGL 4.6 version. 
Using GL3 Renderer in GL 3.x mode... 

@illwieckz

I consider the PR ready.

I tested it successfully on Linux and macOS.

I rewrote the first post.


@slipher slipher left a comment


I pushed one commit to the branch improving type safety, which I made for my own reviewing benefit. Feel free to squash if needed.

I tested on Windows and didn't find anything obviously broken.


if ( glConfig2.glForwardCompatibleContext )
{
    logger.Debug( "Provided OpenGL core context is forward compatible." );
slipher:

This is logged in another place already

illwieckz:

Unless I'm missing something, the other place is in gfxinfo, and the engine may stop before the gfxinfo call is reached.

return rserr_t::RSERR_OLD_GL;
}

static rserr_t GLimp_ApplyCustomOptions( const int GLEWmajor, const bool fullscreen, const bool bordered, const glConfiguration &bestConfiguration, glConfiguration &requestedConfiguration, bool &customOptions )
slipher:

It seems better for this function to only return a new configuration struct. The caller can do the window/context creation calls.

Also, can you just use operator!= between bestConfiguration and the customized options to get the answer to whether custom options were applied?
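A sketch of that operator!= suggestion (illustrative only; the struct shown is a stand-in with assumed fields, not the engine's actual glConfiguration):

struct glConfiguration { int major; int minor; bool core; int colorBits; }; // illustrative stand-in

static bool operator==( const glConfiguration &a, const glConfiguration &b )
{
    return a.major == b.major && a.minor == b.minor
        && a.core == b.core && a.colorBits == b.colorBits;
}

static bool operator!=( const glConfiguration &a, const glConfiguration &b )
{
    return !( a == b );
}

// Then: bool customOptions = ( requestedConfiguration != bestConfiguration );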

illwieckz:

I'm not sure I get how it would be better. I'm in favor of merging this as it works, and maybe evaluating more improvements later. This code is not wrong.

@slipher

slipher commented Feb 14, 2022

I tried to remove a lot of code duplication: PTAL at the slipher/gldetect branch in this repo. The biggest change is that when trying versions from the version table, it keeps track of the highest version we actually want to use, in addition to the absolute highest version. So the ApplyPreferredContext function doesn't have to do a bunch of logic to find the highest version we want to use.

Most of these commits can probably be squashed.

@illwieckz

illwieckz commented Feb 14, 2022

OK, thanks. From what I see in slipher/gldetect, there are many good things there, thanks a lot!

One comment though, on “unnecessary glExtendedValidation variable”:

If I remember correctly, this was to detect the case of r_glExtendedValidation having been modified after startup and before vid_restart, so that cached results from a previous validation are not used when the cvar is modified: if r_glExtendedValidation was disabled before and becomes enabled, the code may still assume 3.2 is the best version available without redoing the tests starting from 4.6.

So whatever the way it is implemented, I want to be sure that modifying r_glExtendedValidation redoes the validation from scratch on the next vid_restart.

@slipher

slipher commented Feb 14, 2022

I want to be sure that modifying r_glExtendedValidation redoes the validation from scratch on the next vid_restart

Is there some reason to do extended validation from scratch every time instead of only the first time it is requested?

@illwieckz

illwieckz commented Feb 14, 2022

I'm not sure I get the question. Do you mean you have the idea to reset the cvar to zero once the extended validation is done?

@slipher

slipher commented Feb 14, 2022

No. I'm doing extended validation only the first time that the cvar is turned on, rather than every time it is changed.

@illwieckz

illwieckz commented Feb 14, 2022

I have two needs:

  • be able to ask someone to do extended validation so they can send me a complete log (the cvar does not have to be persistent);
  • be able to do extended validation anytime when validating a bunch of hardware/software/environment combinations (the cvar being persistent can be useful, but I can work around it not being persistent by scripting).

If I remember correctly, my initial code was redoing non-extended validation from scratch on vid_restart after switching r_glExtendedValidation off and was redoing extended validation from scratch on vid_restart after switching r_glExtendedValidation on.

There may be other ways to fulfill my needs, but at least this one had predictable behaviour: after a vid_restart the selected profile was one produced according to r_glExtendedValidation, even if it was one selected at a previous vid_restart call:

initial config: r_glExtendedValidation 0
engine start: basic validation
/vid_restart: reuse cached profile version from basic validation
/vid_restart: reuse cached profile version from basic validation
/set r_glExtendedValidation 1
/vid_restart: extended validation
/vid_restart: reuse cached profile version from extended validation
/vid_restart: reuse cached profile version from extended validation
/set r_glExtendedValidation 0
/vid_restart: basic validation
/vid_restart: reuse cached profile version from basic validation
/vid_restart: reuse cached profile version from basic validation

@slipher

slipher commented Feb 14, 2022

In my version, the max extended version no longer flows into the rest of the program, as the normal max version is stored in a separate variable. The max extended version is only used for the diagnostic log statements.

The validation function always populates the normal max version, and additionally populates the extended max version if the cvar is on. Both of these persist across vid_restarts.

The validation function is called if the normal max version is not yet known or if the cvar is on and the extended max version is not yet known.
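A hedged sketch of that decision (identifiers invented for this example, not the branch's actual code; both cached flags persist across vid_restart):

static bool haveMaxVersion = false;         // Normal max version already known?
static bool haveExtendedMaxVersion = false; // Extended max version already known?

void MaybeValidate( bool extendedValidation ) // stands in for the r_glExtendedValidation cvar
{
    if ( !haveMaxVersion || ( extendedValidation && !haveExtendedMaxVersion ) )
    {
        ValidateBestContext( extendedValidation ); // populates the cached max versions

        haveMaxVersion = true;

        if ( extendedValidation )
        {
            haveExtendedMaxVersion = true;
        }
    }
}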

@illwieckz

as the normal max version is stored in a separate variable.

Ah OK, I missed that, it looks like that does the job.

So once the max version is known, it's known until the end of the program, whatever the current value of the cvar; that's good. It looks to fulfill all my needs.

illwieckz added a commit that referenced this pull request Feb 16, 2022
illwieckz added a commit that referenced this pull request Feb 16, 2022
@illwieckz

@slipher I merged your fixes and squashed them. I added some Co-authored-by: lines in the commit message to reflect that.

LGTM.

@illwieckz

@slipher given this PR now features the very latest changes you wanted to see over mine, I assume you're OK with the code in its current state. Do you have any other comments before I merge?

@slipher

slipher commented Feb 20, 2022

I missed a couple chances to use the ContextDescription function in GLimp_CreateContext and with Sys::Error "GLEW initialization failed".

LGTM

@illwieckz

I missed a couple chances to use the ContextDescription function in GLimp_CreateContext and with Sys::Error "GLEW initialization failed".

Well, I'm not sure I entirely get it, but it's always possible for someone to add another patch on top of this work one day. Thanks a lot for the deep review(s), your precious advice and the patience!
