I was trying the KMSDRM video backend with some very simple programs that were working ok on 2.0.12. The same code won't work on the current dev branch and I get:
DEBUG: check_modesetting: probing "/dev/dri/card0"
DEBUG: /dev/dri/card0 connector, encoder and CRTC counts are: 4 5 6
DEBUG: check_modesetting: probing "/dev/dri/card0"
DEBUG: /dev/dri/card0 connector, encoder and CRTC counts are: 4 5 6
DEBUG: KMSDRM_VideoInit()
DEBUG: Opening device /dev/dri/card0
DEBUG: Opened DRM FD (3)
DEBUG: no atomic modesetting support.
DEBUG: Video subsystem has not been initialized
INFO: Using SDL video driver: (null)
DEBUG: Video subsystem has not been initialized
After carefully checking, the radeon driver doesn't support atomic modesetting. That's not the only problem: the same happens with the amdgpu driver if we disable Display Core (kernel parameter amdgpu.dc=0, which is required to get analogue outputs working).
This is a major regression in the KMSDRM driver.
Using atomic modesetting is great, but having no fallback to the standard (legacy) KMS interface is bad.
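For context, the probe that fails here boils down to a single libdrm call; below is a minimal sketch of the check, with a hypothetical legacy fallback branch (illustrative, not SDL's current code):

    #include <stdio.h>
    #include <fcntl.h>
    #include <xf86drm.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
        if (fd < 0) {
            return 1;
        }

        /* Ask the kernel driver to enable atomic modesetting for this client.
           radeon (and amdgpu with amdgpu.dc=0) rejects this, which is where a
           fallback to the legacy drmModeSetCrtc() path would have to kick in. */
        if (drmSetClientCap(fd, DRM_CLIENT_CAP_ATOMIC, 1) != 0) {
            printf("no atomic modesetting support, legacy KMS needed\n");
        } else {
            printf("atomic modesetting available\n");
        }
        return 0;
    }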
Configure output is practically unchanged. There are still lots of
AC_TRY_COMPILE/AC_TRY_LINK replacements needed to really eliminate
the warnings, but that's for another time.
Alex S
Evdev headers aren't actually included in the base system (well, it has a private copy); they are available through the devel/evdev-proto port instead. We also have devel/libinotify and devel/libudev-devd shims; I didn't verify whether they work with SDL.
In recent versions of EGL headers on Linux, the MESA_EGL_NO_X11_HEADERS macro is
deprecated and has been replaced with EGL_NO_X11. As a result, the configure
script would fail the compilation check for EGL headers and disable EGL (and by
extension, Wayland) support when X11 headers are not installed. Fix this by
adding the correct macro to disable X11 support in the headers.
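The check can be kept working by defining the new macro alongside the deprecated one; a sketch of such a test program (paraphrasing the idea, not the literal configure test):

    /* Older Mesa headers honored MESA_EGL_NO_X11_HEADERS; newer headers expect
       EGL_NO_X11. Defining both keeps the EGL check passing whether or not X11
       headers are installed. */
    #define EGL_NO_X11 1
    #define MESA_EGL_NO_X11_HEADERS 1
    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    int main(void)
    {
        return 0;
    }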
wahil1976
This patch adds the KBIO text input driver for FreeBSD, which allows text input to fully work without text spilling out into the console. It also supports accented input, AltGr keys and Alt Lock combinations.
Tested with the US accent keys layout and various AltGr layouts.
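For background, KBIO is FreeBSD's console keyboard ioctl interface (<sys/kbio.h>). A text input driver built on it takes the console out of translate mode roughly like this (a minimal sketch, not the patch itself; the device path is illustrative):

    #include <sys/ioctl.h>
    #include <sys/kbio.h>
    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("/dev/ttyv0", O_RDONLY);  /* illustrative console device */
        int old_mode;

        if (fd < 0) {
            return 1;
        }
        ioctl(fd, KDGKBMODE, &old_mode);  /* remember the current mode */
        ioctl(fd, KDSKBMODE, K_CODE);     /* raw keycodes: keystrokes stop
                                             spilling into the console */

        /* ... translate keycodes (including dead keys/AltGr) for SDL ... */

        ioctl(fd, KDSKBMODE, old_mode);   /* restore on shutdown */
        close(fd);
        return 0;
    }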
Alex S
...which allows SDL to talk to webcamd/iichid. (Webcamd actually bundles quite a few gamepad drivers.) Note that this does _not_ disable usbhid; both joystick backends will be compiled.
This works on capability bitfields that can either come from udev or
from ioctls, so it is equally applicable to both udev and non-udev
input device detection.
Signed-off-by: Simon McVittie <smcv@collabora.com>
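To illustrate, here is the shape of the ioctl side; the udev path fills the same bitmask from a device property, after which the classification is identical (the function name and the single-button heuristic are illustrative):

    #include <linux/input.h>
    #include <string.h>
    #include <sys/ioctl.h>

    #define BITS_PER_LONG (8 * sizeof(unsigned long))
    #define NLONGS(x)     (((x) + BITS_PER_LONG - 1) / BITS_PER_LONG)

    static int test_bit(int bit, const unsigned long *bits)
    {
        return (bits[bit / BITS_PER_LONG] >> (bit % BITS_PER_LONG)) & 1;
    }

    /* Fill the EV_KEY capability bitfield from an evdev fd; udev-based
       detection parses the same bits from a property string instead. */
    static int looks_like_gamepad(int fd)
    {
        unsigned long keybits[NLONGS(KEY_MAX + 1)];

        memset(keybits, 0, sizeof(keybits));
        if (ioctl(fd, EVIOCGBIT(EV_KEY, sizeof(keybits)), keybits) < 0) {
            return 0;
        }
        return test_bit(BTN_GAMEPAD, keybits);
    }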
This allows one to build Raspberry Pi versions on an ancient version of
Raspbian and get both the KMSDRM and RPI video targets built into SDL, giving
maximum binary compatibility from linking against an older glibc, etc, but
also making one library that can access video on all RPi models and OS
releases.
- SDL_config.h.in: add missing defines SDL_SENSOR_COREMOTION
  and SDL_SENSOR_WINDOWS (configure did set SDL_SENSOR_WINDOWS,
  but it never went into SDL_config.h or the Makefile).
- SDL_config.h.cmake: remove duplicated SDL_SENSOR_XXX cmake
defines.
- autofoo, cmake: check for sensorsapi.h header before enabling
  Windows sensors.
jackmacwindowslinux
I'm testing my application that uses SDL2 on the new Apple Silicon Macs. I set up the SDL 2.0.12 source code from the website and tried to build it. The first issue I ran into was that it was always building OpenGL ES, even if --disable-video-opengles was passed to configure. OpenGL ES headers do not seem to be present in the Apple Silicon macOS SDK, except in the iOS SDK headers. Then I had problems with the joystick driver, where some classes used on iOS were not available on macOS.
After looking through the configure.ac script a bit, I found that iOS targets are selected when the build host matches "arm*-apple-darwin*". Clang on macOS 11.0 on arm64 reports the host as "arm64-apple-darwin20.0.0", which matches the iOS target. This means that ARM Mac compilation will always be detected as iOS. Unfortunately, there doesn't seem to be an easy way to detect Mac vs. iOS targets, since they now both use the same triplet & compiler for building.
I'm not sure what the best way to fix this is, but maybe there could be an additional target flag to specify whether to build for macOS or iOS? This might break compatibility, though: with this approach, either all old scripts that used configure to build for iOS fail, or all new builds on macOS without a flag fail (silently?).
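For what it's worth, the two platforms can still be told apart at compile time even when the triplet can't; a configure-style compile test could look like this (a sketch, not what configure.ac currently does):

    #include <TargetConditionals.h>

    /* TargetConditionals.h distinguishes the Apple platforms that now share
       the arm64-apple-darwin* triplet; this test compiles only for macOS. */
    #if !TARGET_OS_OSX
    #error "not a macOS target"
    #endif

    int main(void)
    {
        return 0;
    }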
The compiler understands it, but the "qcc" compiler driver doesn't, and the
standard Khronos headers upset QNX anyhow, since they try to include X11
headers in the __unix__ section.
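To illustrate the header problem, the platform selection in the stock Khronos eglplatform.h looks roughly like this (paraphrased, not verbatim):

    /* Paraphrase of the Khronos eglplatform.h platform selection: QNX defines
       __unix__ but ships no X11 headers, so the #include below blows up. */
    #if defined(_WIN32)
    typedef void *EGLNativeDisplayType;    /* HDC in the real header */
    #elif defined(__unix__)
    #include <X11/Xlib.h>
    typedef Display *EGLNativeDisplayType;
    #else
    #error "Platform not recognized"
    #endif

    int main(void)
    {
        return 0;
    }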