
Crash at start with 5120x1440 on linux #5124

Closed
robinxan opened this issue Nov 30, 2020 · 7 comments · Fixed by #5178

Comments

@robinxan

Avalonia crashes at startup even with a template app (just an empty window) when I am using the native resolution.
With a lower resolution (3840x1080) it works.
I can rule out the window manager, as this happens on both XFCE and Openbox.
I have no idea whether this is a general issue or whether something is broken/misconfigured on my side.
Any clues?

System Information:
OS: Manjaro Linux x86_64
Kernel: 5.4.80-2-MANJARO
X.Org: 1.20.9
Nvidia Driver: 450.80.02 (i also tried 455.45.01)

Exceptions:
0.9.12:

System.InvalidOperationException: Failed to create Skia render target surface
   at Avalonia.Skia.SurfaceRenderTarget..ctor(CreateInfo createInfo)
   at Avalonia.Skia.DrawingContextImpl.CreateRenderTarget(Size size, Nullable`1 format)
   at Avalonia.Skia.DrawingContextImpl.CreateLayer(Size size)
   at Avalonia.Rendering.RenderLayers.Update(Scene scene, IDrawingContextImpl context)
   at Avalonia.Rendering.DeferredRenderer.UpdateRenderLayersAndConsumeSceneIfNeeded(IDrawingContextImpl& context, Boolean recursiveCall)
   at Avalonia.Rendering.DeferredRenderer.Render(Boolean forceComposite)
   at Avalonia.Rendering.DeferredRenderer.Paint(Rect rect)
   at Avalonia.Controls.TopLevel.HandlePaint(Rect rect)
   at Avalonia.X11.X11Window.DoPaint()
   at Avalonia.X11.X11Window.<OnEventSync>b__81_0()
   at Avalonia.Threading.JobRunner.Job.Avalonia.Threading.JobRunner.IJob.Run()
   at Avalonia.Threading.JobRunner.RunJobs(Nullable`1 priority)
   at Avalonia.X11.X11PlatformThreading.HandleX11(CancellationToken cancellationToken)
   at Avalonia.X11.X11PlatformThreading.RunLoop(CancellationToken cancellationToken)
   at Avalonia.Threading.Dispatcher.MainLoop(CancellationToken cancellationToken)
   at Avalonia.Controls.ApplicationLifetimes.ClassicDesktopStyleApplicationLifetime.Start(String[] args)
   at Avalonia.ClassicDesktopStyleApplicationLifetimeExtensions.StartWithClassicDesktopLifetime[T](T builder, String[] args, ShutdownMode shutdownMode)
   at WorkTimeManager.core.Program.Main(String[] args) in /home/robin/gitprojekts/worktimemanager/worktimemanager.core/Program.cs:line 38

0.10.0-preview6:

System.NullReferenceException: Object reference not set to an instance of an object.
   at Avalonia.Skia.SurfaceRenderTarget..ctor(CreateInfo createInfo)
   at Avalonia.Skia.DrawingContextImpl.CreateRenderTarget(Size size, Nullable`1 format)
   at Avalonia.Skia.DrawingContextImpl.CreateLayer(Size size)
   at Avalonia.Rendering.RenderLayers.Update(Scene scene, IDrawingContextImpl context)
   at Avalonia.Rendering.DeferredRenderer.UpdateRenderLayersAndConsumeSceneIfNeeded(IDrawingContextImpl& context, Boolean recursiveCall)
   at Avalonia.Rendering.DeferredRenderer.Render(Boolean forceComposite)
   at Avalonia.Rendering.DeferredRenderer.Paint(Rect rect)
   at Avalonia.Controls.TopLevel.HandlePaint(Rect rect)
   at Avalonia.X11.X11Window.DoPaint()
   at Avalonia.X11.X11Window.<OnEventSync>b__104_0()
   at Avalonia.Threading.JobRunner.Job.Avalonia.Threading.JobRunner.IJob.Run()
   at Avalonia.Threading.JobRunner.RunJobs(Nullable`1 priority)
   at Avalonia.X11.X11PlatformThreading.HandleX11(CancellationToken cancellationToken)
   at Avalonia.X11.X11PlatformThreading.RunLoop(CancellationToken cancellationToken)
   at Avalonia.Threading.Dispatcher.MainLoop(CancellationToken cancellationToken)
   at Avalonia.Controls.ApplicationLifetimes.ClassicDesktopStyleApplicationLifetime.Start(String[] args)
   at Avalonia.ClassicDesktopStyleApplicationLifetimeExtensions.StartWithClassicDesktopLifetime[T](T builder, String[] args, ShutdownMode shutdownMode)
   at avaloniaTest.Program.Main(String[] args) in /home/robin/gitprojekts/avaloniaTest/Program.cs:line 13

robinxan commented Dec 9, 2020

The issue is caused by the broken EDID of my monitor: it doesn't report the physical size of the panel at 5120x1440, so RandR reports a monitor size of 1 mm x 1 mm.
Because of this, Avalonia attempts to create huge surfaces via Skia.

XDisplayHeightMM and XDisplayWidthMM from libX11.so report the correct values, and the display driver (NVIDIA) is able to figure out the size and DPI of the panel; unfortunately that approach does not work properly with multi-monitor setups.
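To illustrate why the bogus 1 mm size explodes the surfaces, here is a rough sketch of a DPI-based scale guess (hypothetical helper, not Avalonia's exact formula; 96 DPI is the usual X11 baseline):

```python
MM_PER_INCH = 25.4
BASELINE_DPI = 96.0  # common X11/desktop baseline

def guess_scale(width_px: int, width_mm: int) -> float:
    """Scale factor implied by the monitor's reported physical width."""
    dpi = width_px / (width_mm / MM_PER_INCH)
    return dpi / BASELINE_DPI

# Sane EDID: 5120 px over 1193 mm -> roughly 1.1x
sane = guess_scale(5120, 1193)

# Broken EDID: RandR reports 1 mm -> a scale factor above 1300,
# so every surface Avalonia allocates becomes absurdly large
broken = guess_scale(5120, 1)
```

With the 1 mm width the implied density is in the thousands, which is why the Skia render-target allocation fails.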


kekekeks commented Dec 9, 2020

Please show the output of xrandr and xrandr --listmonitors

@robinxan

Here is the output; I removed unused modelines and disconnected displays to shorten it.

$ xrandr
Screen 0: minimum 8 x 8, current 5120 x 1440, maximum 32767 x 32767
...
DP-2 connected primary 5120x1440+0+0 (normal left inverted right x axis y axis) 1mm x 1mm
   3840x1080    119.97 +  59.97  
   5120x1440    120.00*   59.98  
... 
$ xrandr --listmonitors
Monitors: 1
 0: +*DP-2 5120/1x1440/1+0+0  DP-2

I think the fundamental issue is that Avalonia attempts to guess the scaling factor from the resolution of the screen and its physical size, but that has no relation to the actual scaling used by the desktop environment or RandR.
X11Screens.cs@91:

var monitors = XRRGetMonitors(_x11.Display, _window, true, out var count);
var screens = new X11Screen[count];
for (var c = 0; c < count; c++)
{
    var mon = monitors[c];
    var namePtr = XGetAtomName(_x11.Display, mon.Name);
    var name = Marshal.PtrToStringAnsi(namePtr);
    XFree(namePtr);

    var density = 1d;
    if (_settings.NamedScaleFactors?.TryGetValue(name, out density) != true)
    {
        if (mon.MWidth == 0)
            density = 1;
        else
            density = X11Screen.GuessPixelDensity(mon.Width, mon.MWidth);
    }

    density *= _settings.GlobalScaleFactor;
    
    var bounds = new PixelRect(mon.X, mon.Y, mon.Width, mon.Height);
    screens[c] = new X11Screen(bounds, 
        mon.Primary != 0, 
        name,
        (mon.MWidth == 0 || mon.MHeight == 0) ? (Size?)null : new Size(mon.MWidth, mon.MHeight),
        density);
}

The guessed density is then later used as the scaling factor for all rendered elements.
X11Window.cs@555:

var monitor = _platform.X11Screens.Screens.OrderBy(x => x.PixelDensity)
    .FirstOrDefault(m => m.Bounds.Contains(Position));
newScaling = monitor?.PixelDensity ?? RenderScaling;

As far as I can tell, RandR's scaling is invisible to applications and should not be handled by them.
Example: xrandr --output DP-2 --scale 1.25x1.25
Then check which mode and resolution are in use:

$ xrandr
Screen 0: minimum 8 x 8, current 6400 x 1800, maximum 32767 x 32767
...
DP-2 connected primary 6400x1800+0+0 (normal left inverted right x axis y axis) 1mm x 1mm
   3840x1080    119.97 +  59.97  
   5120x1440    120.00*   59.98  
...  
$ xrandr --listmonitors
Monitors: 1
 0: +*DP-2 6400/1x1800/1+0+0  DP-2

And then there is window scaling implemented through the desktop environment (GNOME, KDE, XFCE, ...); this seems to be completely ignored.
Example for XFCE:

$ xfconf-query -c xsettings -p /Gdk/WindowScalingFactor
1

I believe the proper way to detect the window scaling would be:
AVALONIA_SCREEN_SCALE_FACTORS -> scaling from the desktop environment -> fallback to 1

Link to ArchWiki about HiDPI: https://wiki.archlinux.org/index.php/HiDPI#Xorg
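A minimal sketch of that fallback chain (the NAME=VALUE;... format for AVALONIA_SCREEN_SCALE_FACTORS is an assumption, and only the XFCE path is shown for the desktop-environment step; the helper name is made up):

```python
import os
import subprocess

def detect_scale_factor(monitor_name: str) -> float:
    """Sketch: env override -> desktop environment setting -> 1.0."""
    # 1. Explicit per-monitor override, e.g. "DP-2=2;HDMI-1=1.5"
    overrides = os.environ.get("AVALONIA_SCREEN_SCALE_FACTORS", "")
    for entry in overrides.split(";"):
        name, _, value = entry.partition("=")
        if name == monitor_name and value:
            return float(value)

    # 2. Desktop environment setting (XFCE shown; other DEs would
    #    need their own lookup, e.g. XSettings)
    try:
        out = subprocess.run(
            ["xfconf-query", "-c", "xsettings",
             "-p", "/Gdk/WindowScalingFactor"],
            capture_output=True, text=True, check=True,
        )
        return float(out.stdout.strip())
    except (OSError, subprocess.CalledProcessError, ValueError):
        pass

    # 3. Fallback: no scaling
    return 1.0
```

The point is that every step reflects a scale the user (or the DE) actually chose, instead of a value guessed from possibly-broken EDID data.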

@kekekeks

window scaling implemented through the desktop environment

IIRC desktop environments are intentionally not implementing per-monitor DPI for X11 to force migration to Wayland

@kekekeks

I guess we should limit the auto-detected scaling to 300% or something

@danwalmsley
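A minimal sketch of that cap (the helper name and exact clamp policy are made up):

```python
MAX_AUTO_SCALE = 3.0  # cap auto-detected scaling at 300%

def sanitize_density(guessed: float) -> float:
    """Clamp an EDID-derived scale factor to a plausible range."""
    if guessed <= 0:
        return 1.0
    return min(guessed, MAX_AUTO_SCALE)
```

With the broken 1 mm EDID above, a guess in the thousands would be clamped to 3.0 instead of asking Skia for an impossibly large surface.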

@robinxan

I think it's better to detect the screen scaling factor in a more reliable way, and then fall back to 100%.
Qt solves this issue by requesting the EDID from Xorg and parsing it; if that fails, it falls back to 100%.

QEdidParser
QEdidParser only parses the maximum image size and multiplies it by 10 (it is stored in cm), not the physical size mapped to the selected resolution.
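That approach can be sketched directly against the EDID base block: bytes 21 and 22 of the 128-byte base block hold the horizontal and vertical screen size in centimetres (EDID 1.4); the helper name is made up:

```python
EDID_MAGIC = b"\x00\xff\xff\xff\xff\xff\xff\x00"

def edid_max_image_size_mm(edid: bytes):
    """Return (width_mm, height_mm) from an EDID base block, or None.

    Bytes 21/22 store the maximum image size in cm; multiply by 10
    for mm, as QEdidParser does. A value of 0 means the size is
    undefined (or aspect-ratio-coded).
    """
    if len(edid) < 128 or not edid.startswith(EDID_MAGIC):
        return None  # not a valid EDID base block
    h_cm, v_cm = edid[21], edid[22]
    if h_cm == 0 or v_cm == 0:
        return None
    return (h_cm * 10, v_cm * 10)
```

For the monitor decoded below (119 cm x 34 cm) this yields 1190 mm x 340 mm, which is close enough to derive a sane DPI even when RandR reports 1 mm.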

Here is a shortened decoded EDID from my monitor to show why this would work:

$ edid-decode < samsung.bin 
edid-decode (hex):

...

----------------

Block 0, Base EDID:
  EDID Structure Version & Revision: 1.4
  Vendor & Product Identification:
    Manufacturer: SAM
    Model: 28755
    Serial Number: 1129855301
    Made in: 2020
  Basic Display Parameters & Features:
    Digital display
    Bits per primary color channel: 10
    DisplayPort interface
    Maximum image size: 119 cm x 34 cm
    Gamma: 2.20
    DPMS levels: Off
    Supported color formats: RGB 4:4:4, YCrCb 4:4:4, YCrCb 4:2:2
    First detailed timing includes the native pixel format and preferred refresh rate
  Color Characteristics:
    Red  : 0.6943, 0.2929
    Green: 0.2744, 0.6591
    Blue : 0.1484, 0.0566
    White: 0.3134, 0.3291
  Established Timings I & II:
    ...
  Detailed Timing Descriptors:
    DTD 1:  3840x1080  119.974 Hz  32:9   137.250 kHz 549.000 MHz (1193 mm x 336 mm)
                 Hfront   48 Hsync  32 Hback  80 Hpol P
                 Vfront    3 Vsync  10 Vback  51 Vpol N
  Display Range Limits:
    Monitor ranges (GTF): 50-120 Hz V, 30-183 kHz H, max dotclock 970 MHz
    Display Product Name: 'LC49G95T'
    Display Product Serial Number: 'H4ZN600270'
  Extension blocks: 2
Checksum: 0xad

----------------

...

----------------

Block 2, DisplayID Extension Block:
  Version: 1.2
  Extension Count: 0
  Display Product Type: Extension Section
  Video Timing Modes Type 1 - Detailed Timings Data Block:
    DTD:  5120x1440  119.999 Hz   0:0   182.879 kHz 965.600 MHz (aspect undefined, no 3D stereo, preferred)
               Hfront   48 Hsync  32 Hback  80 Hpol P
               Vfront    3 Vsync  10 Vback  71 Vpol N
    DTD:  5120x1440   59.977 Hz   0:0    88.826 kHz 469.000 MHz (aspect undefined, no 3D stereo)
               Hfront   48 Hsync  32 Hback  80 Hpol P
               Vfront    3 Vsync  10 Vback  28 Vpol N
  Checksum: 0x4f
Checksum: 0x90

The maximum image size is correct, as is the mapped size for 3840x1080, but 5120x1440 has no size or aspect ratio information.
Creating a custom EDID for 5120x1440 was not possible: the edid.s supplied by Linux would fail to compile because values were truncated, and UI tools would not let me enter the resolution because it is too high.
I believe this to be a limitation of EDID 1.4, and Samsung used a hack to get the mode in anyway.
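That limitation is easy to check: a base-block Detailed Timing Descriptor packs horizontal active pixels (and the image size in mm) as 8 low bits plus a 4-bit high nibble, i.e. 12-bit fields:

```python
# Maximum value of a 12-bit DTD field (H/V active pixels, image size)
DTD_FIELD_MAX = (1 << 12) - 1  # 4095

# 3840x1080 fits, which is why DTD 1 above carries full timing and
# size info; 5120 does not, so the mode had to move to a DisplayID
# extension block, which here carries no size information.
fits_3840 = 3840 <= DTD_FIELD_MAX
fits_5120 = 5120 <= DTD_FIELD_MAX
```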

@kekekeks

We can't really use Qt as a source-code reference due to its license.
Fortunately, Chromium has code for obtaining and parsing EDID.
