This allows us to handle controllers that use the Xbox protocol but look like Nintendo Switch or PlayStation controllers, like the Qanba Dragon Arcade Stick in PC mode.
This is done so that we can disable LTO for these two functions when
building with MSVC.
This works around a limitation of Link Time Code Generation (LTCG):
code generation may emit a new reference to memset after linking has
started, so LTCG must make assumptions about where memset is
defined, which is normally the C runtime.
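As a rough illustration of the failure mode, here is a minimal sketch (not SDL's actual code) of why a build that doesn't link the C runtime must supply its own memset: MSVC's code generator can emit memset calls for struct zeroing even when the source never calls it, so a real definition must exist for LTCG to resolve.

    #include <stddef.h>

    #if defined(_MSC_VER) && !defined(__clang__)
    /* Force a real function definition rather than an intrinsic
     * expansion, so the reference LTCG emits later has something
     * to resolve against. */
    #pragma function(memset)
    #endif

    void *memset(void *dst, int c, size_t len)
    {
        unsigned char *p = (unsigned char *)dst;
        while (len--) {
            *p++ = (unsigned char)c;
        }
        return dst;
    }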
Some compositors provide 'nicer' / 'human readable' output descriptions via the xdg-output protocol. Use these description strings, when available, instead of the model name provided by wl_output. On compositors such as GNOME where this is provided, the display names that SDL reports to applications will now match those in the desktop display settings panel. On compositors where this data isn't provided, the old behavior of using the model string from wl_output remains unchanged.
Additionally, per the protocol spec, output data provided by xdg-output should supersede wl_output data, so this is the recommended behavior in general.
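A minimal sketch of the pattern, assuming client bindings generated from xdg-output-unstable-v1 by wayland-scanner; the display_info struct is illustrative, not SDL's internal type:

    #include <stdlib.h>
    #include <string.h>
    #include "xdg-output-unstable-v1-client-protocol.h"

    struct display_info {   /* hypothetical per-output bookkeeping */
        char *name;
    };

    static void xdg_output_handle_description(void *data,
                                              struct zxdg_output_v1 *xdg_output,
                                              const char *description)
    {
        struct display_info *info = data;

        /* Per the spec, xdg-output data supersedes the wl_output
         * model string. */
        free(info->name);
        info->name = strdup(description);
    }

    /* Unused events still need handlers in the listener. */
    static void xdg_output_handle_logical_position(void *data, struct zxdg_output_v1 *o, int32_t x, int32_t y) {}
    static void xdg_output_handle_logical_size(void *data, struct zxdg_output_v1 *o, int32_t w, int32_t h) {}
    static void xdg_output_handle_done(void *data, struct zxdg_output_v1 *o) {}
    static void xdg_output_handle_name(void *data, struct zxdg_output_v1 *o, const char *name) {}

    static const struct zxdg_output_v1_listener xdg_output_listener = {
        xdg_output_handle_logical_position,
        xdg_output_handle_logical_size,
        xdg_output_handle_done,
        xdg_output_handle_name,
        xdg_output_handle_description,
    };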
If the width is sufficiently ludicrous, then the calculated pitch or
the image size could conceivably overflow a signed integer, which
is undefined behaviour. Calculate in the unsigned size_t domain, with
overflow checks.
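A sketch of the idea, with illustrative names rather than SDL's actual internals:

    #include <limits.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Returns 0 on success, -1 if the pitch would overflow. */
    static int calculate_pitch(size_t width, size_t bytes_per_pixel,
                               size_t *out_pitch)
    {
        size_t pitch;

        if (bytes_per_pixel != 0 && width > SIZE_MAX / bytes_per_pixel) {
            return -1; /* width * bytes_per_pixel would overflow */
        }
        pitch = width * bytes_per_pixel;

        if (pitch > SIZE_MAX - 3) {
            return -1; /* rounding up would overflow */
        }
        pitch = (pitch + 3) & ~(size_t)3; /* round up to a 4-byte boundary */

        if (pitch > (size_t)INT_MAX) {
            return -1; /* must still fit in a signed int pitch field */
        }
        *out_pitch = pitch;
        return 0;
    }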
Signed-off-by: Simon McVittie <smcv@collabora.com>
Adds the hint "SDL_WINDOWS_DPI_SCALING", which can be set to "1" to
change the SDL coordinate system units to DPI-scaled points rather
than pixels everywhere.
This means windows will be appropriately sized, even when created on
high-DPI displays with scaling.
e.g. requesting a 640x480 window from SDL, on a display with 125%
scaling in Windows display settings, will create a window with an
800x600 client area (in pixels).
Setting this to "1" implicitly requests process DPI awareness
(setting SDL_WINDOWS_DPI_AWARENESS is unnecessary),
and forces SDL_WINDOW_ALLOW_HIGHDPI on all windows.
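A hypothetical usage sketch, assuming the SDL_HINT_WINDOWS_DPI_SCALING constant that corresponds to this hint; the window title and size are illustrative:

    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        /* Opt in before creating any windows. */
        SDL_SetHint(SDL_HINT_WINDOWS_DPI_SCALING, "1");

        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            return 1;
        }

        /* 640x480 in points; at 125% scaling the pixel client area
         * is 800x600. */
        SDL_Window *window = SDL_CreateWindow("dpi-scaling demo",
                                              SDL_WINDOWPOS_CENTERED,
                                              SDL_WINDOWPOS_CENTERED,
                                              640, 480, 0);
        int w, h;
        SDL_GetWindowSize(window, &w, &h); /* reports 640x480 points */
        SDL_Log("window size in points: %dx%d", w, h);

        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }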
If the move results in a DPI change, we need to allow the window to resize (e.g. the frame sizes computed by AdjustWindowRectExForDpi differ between DPIs).
- WM_DPICHANGED: Don't assume WM_GETDPISCALEDSIZE is always called for PMv2 awareness - it's only called during interactive dragging.
- WIN_AdjustWindowRectWithStyle: always calculate the final window size, including the frame, based on the destination rect rather than the current window DPI (see the sketch after the scenario list below).
- Update wmmsg.h to include WM_GETDPISCALEDSIZE (for WMMSG_DEBUG)
- WIN_AdjustWindowRectWithStyle: add optional logging
- WM_GETMINMAXINFO: add optional HIGHDPI_DEBUG logging
- WM_DPICHANGED: fix potentially clobbering data->expected_resize
Together these changes fix the following scenario:
- launch testwm2 with the SDL_WINDOWS_DPI_AWARENESS=permonitorv2 environment variable
- Windows 10 21H2 (OS Build 19044.1706)
- Left (primary) monitor: 3840x2160, 125% scaling
- Right (secondary) monitor: 2560x1440, 100% scaling
- Alt+Enter, Alt+Enter (to enter + leave desktop fullscreen), Alt+Right (to move window to right monitor). Ensure the window client area stays 640x480. Drag the window back to the 125% monitor, ensure client area stays 640x480.
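As referenced above, a sketch of sizing the frame at the destination DPI (a hypothetical handler, not SDL's actual code; AdjustWindowRectExForDpi requires Windows 10 1607+):

    #include <windows.h>

    /* Hypothetical WM_DPICHANGED handler that keeps a fixed client size. */
    static void HandleDpiChanged(HWND hwnd, WPARAM wParam, LPARAM lParam,
                                 int client_w, int client_h)
    {
        const UINT new_dpi = HIWORD(wParam);  /* new DPI for the window */
        const RECT *suggested = (const RECT *)lParam;
        RECT rect = { 0, 0, client_w, client_h };

        /* Compute the frame with the *destination* DPI's metrics;
         * reusing the frame size from the old DPI yields a wrong
         * client area. */
        AdjustWindowRectExForDpi(&rect,
                                 (DWORD)GetWindowLongPtr(hwnd, GWL_STYLE),
                                 FALSE,
                                 (DWORD)GetWindowLongPtr(hwnd, GWL_EXSTYLE),
                                 new_dpi);

        SetWindowPos(hwnd, NULL, suggested->left, suggested->top,
                     rect.right - rect.left, rect.bottom - rect.top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
    }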
The hint allows setting a specific DPI awareness ("unaware", "system", "permonitor", "permonitorv2").
This is the first part of High-DPI support on Windows (https://github.com/libsdl-org/SDL/issues/2119).
It doesn't implement a virtualized SDL coordinate system, which will be
addressed in a later commit. (This hint could be useful for SDL apps
that want 1 SDL unit = 1 pixel, though.)
Detecting and behaving correctly under per-monitor V2
(calling AdjustWindowRectExForDpi where needed) should fix the
following issues:
https://github.com/libsdl-org/SDL/issues/3286
https://github.com/libsdl-org/SDL/issues/4712
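A hypothetical usage sketch, assuming the SDL_HINT_WINDOWS_DPI_AWARENESS constant that corresponds to this hint; it must be set before video initialization:

    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        /* Equivalent to launching with the
         * SDL_WINDOWS_DPI_AWARENESS=permonitorv2 environment variable. */
        SDL_SetHint(SDL_HINT_WINDOWS_DPI_AWARENESS, "permonitorv2");

        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            return 1;
        }

        /* With awareness set but no coordinate virtualization,
         * 1 SDL unit = 1 pixel. */
        float ddpi = 0.0f;
        if (SDL_GetDisplayDPI(0, &ddpi, NULL, NULL) == 0) {
            SDL_Log("diagonal DPI of display 0: %.1f", ddpi);
        }
        SDL_Quit();
        return 0;
    }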
While looking at how SDL handles errors, I came across the #define SDL_ERRBUFIZE, which looks like a typo for SDL_ERRBUFSIZE. Either way, it is no longer used anywhere: it gets reported whenever SDL is compiled with -Wunused-macros, and the last reference to it in the code was in commit 09ca66b.
XDG-toplevel min/max size values are double-buffered data and must be committed before entering the fullscreen state, or a maximum window size smaller than the display dimensions may cause the compositor to configure the fullscreen window with the wrong size. This fixes windowed->fullscreen transitions on GNOME, where certain combinations of window flags and min/max size values could previously cause entering fullscreen mode to fail with odd window sizes and/or offsets: the new max size values were not committed before entering fullscreen, so the compositor clamped to the old values.
In the case of libdecor, it has its own layer of buffering on top of the xdg-toplevel surface for the min/max window dimensions, so both a frame commit and a surface commit are required to set the state properly.
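A sketch of the required ordering under plain xdg-shell (standard protocol requests; the toplevel and surface handles are assumed to already exist):

    #include <wayland-client.h>
    #include "xdg-shell-client-protocol.h"

    static void enter_fullscreen(struct xdg_toplevel *toplevel,
                                 struct wl_surface *surface,
                                 int32_t max_w, int32_t max_h)
    {
        /* Min/max sizes are double-buffered: commit them before
         * requesting the fullscreen state. */
        xdg_toplevel_set_min_size(toplevel, 0, 0);
        xdg_toplevel_set_max_size(toplevel, max_w, max_h);
        wl_surface_commit(surface);

        /* NULL output: the compositor chooses where to go fullscreen. */
        xdg_toplevel_set_fullscreen(toplevel, NULL);
        wl_surface_commit(surface);
    }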
Previously, the surface damage region was being set in the same callback used for preventing render hangs in the GL backend when the surface was not visible. This was not ideal, as the callback was never fired in the case of using a different render backend or having a swap interval of 0. Use a separate frame callback for setting the surface damage region to ensure that it fires reliably, regardless of the backend being used or swap interval.
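A sketch of a dedicated, self-rearming frame callback used only for damage (a minimal illustration, not SDL's actual code; the damage takes effect on the next surface commit, e.g. a buffer swap):

    #include <stdint.h>
    #include <wayland-client.h>

    static void damage_frame_done(void *data, struct wl_callback *cb,
                                  uint32_t time);

    static const struct wl_callback_listener damage_frame_listener = {
        damage_frame_done
    };

    static void damage_frame_done(void *data, struct wl_callback *cb,
                                  uint32_t time)
    {
        struct wl_surface *surface = data;

        wl_callback_destroy(cb);

        /* Damage the whole surface, then re-arm for the next frame so
         * this fires regardless of render backend or swap interval. */
        wl_surface_damage(surface, 0, 0, INT32_MAX, INT32_MAX);
        cb = wl_surface_frame(surface);
        wl_callback_add_listener(cb, &damage_frame_listener, surface);
    }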
Some compositors (GNOME, for example) don't set the transform flag when dealing with portrait-mode displays, so the video modes won't have the width/height swapped in all cases where they should be. When determining whether to swap the width and height of the emulated video modes, check both for a 90/270-degree transform flag and for the display being taller than it is wide, and adjust the comparison logic when size-testing against the native mode to account for this.
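A sketch of the combined check, using the wl_output transform enum with illustrative variable and function names:

    #include <stdbool.h>
    #include <stdint.h>
    #include <wayland-client-protocol.h>

    static bool output_is_rotated(enum wl_output_transform transform,
                                  int32_t native_w, int32_t native_h)
    {
        /* Trust an explicit 90/270-degree transform when the compositor
         * sets one, but also fall back to the aspect ratio, since some
         * compositors leave the transform at "normal" for portrait
         * displays. */
        switch (transform) {
        case WL_OUTPUT_TRANSFORM_90:
        case WL_OUTPUT_TRANSFORM_270:
        case WL_OUTPUT_TRANSFORM_FLIPPED_90:
        case WL_OUTPUT_TRANSFORM_FLIPPED_270:
            return true;
        default:
            return native_h > native_w;
        }
    }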
Add the hint "SDL_VIDEO_WAYLAND_MODE_EMULATION", which can be used to disable mode emulation under Wayland. When disabled, only the desktop and/or native display resolution is exposed.
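A hypothetical usage sketch, assuming the SDL_HINT_VIDEO_WAYLAND_MODE_EMULATION constant that corresponds to this hint:

    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        /* Disable mode emulation before video init; only the desktop
         * and/or native display resolution will be exposed. */
        SDL_SetHint(SDL_HINT_VIDEO_WAYLAND_MODE_EMULATION, "0");

        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            return 1;
        }
        SDL_Log("%d display mode(s) exposed", SDL_GetNumDisplayModes(0));
        SDL_Quit();
        return 0;
    }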