The backing scale factor can change when the window moves between retina and non-retina displays.
The only other way to detect such a change is to compare the output of SDL_GL_GetDrawableSize or SDL_GetRendererOutputSize every frame, which is less than desirable, especially since the necessary app logic is likely already being executed when a window resize event is received.
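For illustration, a hedged sketch of how an app could pick this up from its existing resize handling instead of polling every frame (the "window" variable and the viewport step are assumptions, not part of this change):
SDL_Event event;
while (SDL_PollEvent(&event)) {
    if (event.type == SDL_WINDOWEVENT &&
        event.window.event == SDL_WINDOWEVENT_SIZE_CHANGED) {
        int drawable_w, drawable_h;
        SDL_GL_GetDrawableSize(window, &drawable_w, &drawable_h);
        /* recompute the viewport / UI scale for the new backing scale */
    }
}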
This lets SDL-based apps respond to "Open With" commands properly, as they
can now obtain the requested path via a standard SDL dropfile event.
This is only checked on startup, so apps don't get drop events at any other
time, even if Android supports that, but this is still a definite
improvement.
Fixes Bugzilla #2762.
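As a rough illustration (not part of the patch itself), the requested path arrives through the standard dropfile handling:
SDL_Event event;
while (SDL_PollEvent(&event)) {
    if (event.type == SDL_DROPFILE) {
        char *path = event.drop.file;   /* the "Open With" path */
        /* ... open the document at 'path' ... */
        SDL_free(path);                 /* the app must free the string */
    }
}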
This apparently has fallout: PS4 (and maybe PS3?) controllers report some bogus
axes, but this won't change the axes we currently expect, and thus the game
controller config string is still stable.
Fixes Bugzilla #2719.
Since all device detection/removal now happens on the main thread, events are posted inline as the status changes occur.
Also fixed rare cases where joystick API functions could return data about removed joysticks when called with a device index.
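For context, a hedged sketch of how an app consumes these inline device events (what to do on add/remove is up to the app):
SDL_Event event;
while (SDL_PollEvent(&event)) {
    switch (event.type) {
    case SDL_JOYDEVICEADDED:
        /* event.jdevice.which is a device index here */
        SDL_JoystickOpen(event.jdevice.which);
        break;
    case SDL_JOYDEVICEREMOVED:
        /* event.jdevice.which is an instance id here */
        /* ... close the matching SDL_Joystick ... */
        break;
    }
}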
id.zeta
The optimization from Bug 11 added a code branch for cases where the source RGB masks match the destination RGB masks, in which an optimized blit function, Blit4to4MaskAlpha, that always overrides the source alpha info would be chosen. Unfortunately, the branch also erroneously took over the RGBA<->RGBA blitting cases, where the source alpha info should be copied; instead it would get overridden in Blit4to4MaskAlpha.
The attached patch fixes that by handling the RGBA<->RGBA cases correctly in that branch with the original BlitNtoNCopyAlpha, as well as adding an optimized Blit4to4CopyAlpha in the same vein.
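To illustrate the affected case (a sketch, not the patch itself): blitting between two surfaces with identical RGBA masks and SDL_BLENDMODE_NONE, where the source's per-pixel alpha is expected to be copied to the destination rather than replaced:
Uint32 rmask = 0x000000ff, gmask = 0x0000ff00, bmask = 0x00ff0000, amask = 0xff000000;
SDL_Surface *src = SDL_CreateRGBSurface(0, 64, 64, 32, rmask, gmask, bmask, amask);
SDL_Surface *dst = SDL_CreateRGBSurface(0, 64, 64, 32, rmask, gmask, bmask, amask);
SDL_SetSurfaceBlendMode(src, SDL_BLENDMODE_NONE);  /* copy, don't blend */
SDL_BlitSurface(src, NULL, dst, NULL);             /* source alpha should be copied */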
The expected use case is for games designed with multiple aspect ratios already in mind, leaving optional margins on the edges that won't hurt if they are cut off.
An example use case is a game designed for wide-screen/16:9 that then wants to deploy on an iPad, which is 4:3. Normally, SDL will letterbox, which shrinks things and results in wasted space. But the designer already thought about 4:3 and designed the edges of the game so they could be cut off without any functional loss. So rather than wasting space with letterboxing, "overscan" mode will zoom the rendering to fill up the entire screen. Parts on the edges will be drawn offscreen, but since the game was already designed with this in mind, that's fine. The end result is that the iPad (4:3) experience is much better, since it feels like a game designed for that screen aspect ratio.
This patch introduces a new SDL_hint: SDL_HINT_RENDER_LOGICAL_SIZE_MODE.
Valid values are "letterbox" or "0" for letterboxing and "overscan" or "1" for overscan.
The default mode is letterbox to preserve existing behavior.
// Example usage:
SDL_SetHint(SDL_HINT_RENDER_LOGICAL_SIZE_MODE, "overscan");
SDL_RenderSetLogicalSize(renderer, SCREEN_WIDTH, SCREEN_HEIGHT);
Elias Vanderstuyft
Remove the dependency of the Linux "phase" calculation on "period"; currently the "phase" parameter is interpreted as a time shift instead of a phase shift.
The Linux input documentation is not clear about the exact units of the "phase" parameter (see http://lxr.free-electrons.com/source/include/uapi/linux/input.h?v=3.17#L1075 ),
but we're about to standardize the 'phase shift' interpretation in the Linux input documentation,
since this will make it easier for a driver to recalculate the effect's state when the user dynamically updates the "period" parameter.
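As a sketch of where this matters (the values are only illustrative): a periodic SDL haptic effect whose "phase" is meant as a phase shift, so its meaning should not change if "period" is later updated:
SDL_HapticEffect effect;
SDL_memset(&effect, 0, sizeof(effect));
effect.type = SDL_HAPTIC_SINE;
effect.periodic.direction.type = SDL_HAPTIC_POLAR;
effect.periodic.direction.dir[0] = 18000;  /* force from the south */
effect.periodic.period = 1000;             /* ms */
effect.periodic.magnitude = 20000;
effect.periodic.phase = 9000;              /* 90 degrees, in hundredths of a degree */
effect.periodic.length = 5000;             /* ms */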
Elias Vanderstuyft
It's not obvious from the general "haptic direction" description what the SDL direction actually means in terms of force magnitude sign;
currently its meaning is only reflected by the example.
This change does a few things, all with regards to the WinRT implementation of
SDL_GetPrefPath():
1. it fixes a bug whereby SDL_GetPrefPath() did not create the directory it
returned. On other SDL platforms, SDL_GetPrefPath() will create separate
directories for its 'org' and 'app' folders. Without this, attempts to create
files in the pref-path would fail, unless those directories were first created
by the app, or by some other library the app used. This change makes sure
that these directories get created, before SDL_GetPrefPath() returns to its
caller(s).
2. it defaults to having SDL_GetPrefPath() return a WinRT 'Local' folder
on all platforms. Previously, for Windows Store apps, it would have used a
different, 'Roaming' folder. Files in Roaming folders can be automatically
synchronized across multiple devices by Windows. This synchronization can
happen while the app runs, with new files being copied into a running app's
pref-path. Unless an app is specifically designed to handle this scenario,
there is a chance that save-data could be overwritten in unwanted or
unexpected ways.
The default is now to use a Local folder, which does not get synchronized, and
which is arguably a bit safer to use. Apps that wish to use Roaming folders
can do so by setting SDL_HINT_WINRT_PREF_PATH_ROOT to "roaming" (see the
sketch after this list); however, it is recommended that one first read
Microsoft's documentation for Roaming files, a link to which is provided in
README-winrt.md.
To preserve older pref-path selection behavior (found in SDL 2.0.3, as well as
many pre-2.0.4 versions of SDL from hg.libsdl.org), which uses a Roaming path
in Windows Store apps, and a Local path in Windows Phone, set
SDL_HINT_WINRT_PREF_PATH_ROOT to "old".
Please note that Roaming paths are not supported on Windows Phone 8.0, due to
limitations in the OS itself. Attempts to use this will fail.
(Windows Phone 8.1 does not have this limitation, however.)
3. It makes SDL_GetPrefPath(), when on Windows Phone 8.0, and when
SDL_HINT_WINRT_PREF_PATH_ROOT is set to "roaming", return NULL, rather than
silently defaulting to a Local path (then switching to a Roaming path if and
when the user upgraded to Windows Phone 8.1).
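As a hedged usage sketch (the org/app names are placeholders), opting into the Roaming behavior described in item 2 looks like this:
SDL_SetHint(SDL_HINT_WINRT_PREF_PATH_ROOT, "roaming");
char *prefPath = SDL_GetPrefPath("My Company", "My App");
if (prefPath) {
    /* ... write save data under prefPath ... */
    SDL_free(prefPath);
} else {
    /* e.g. "roaming" on Windows Phone 8.0, which is not supported */
}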
Jonas Kulla
src/main/windows/SDL_windows_main.c:137:
cmdline = SDL_iconv_string("UTF-8", "UCS-2-INTERNAL", (char *)(text), (SDL_wcslen(text)+1)*sizeof(WCHAR));
I'm trying to compile an SDL2 application for Windows using the mingw-w64 32bit toolchain provided by my distro (Fedora 19). However, even the simplest test program that does nothing at all fails to start up with a "Fatal error - out of memory" message, because the mingw iconv library provided by my distro does not support the "UCS-2-INTERNAL" encoding and the conversion returns null.
From my little bit of research, it turns out that even though this encoding is supported by the external GNU libiconv library, some glibc versions (?) don't support it with their internal iconv routines, and will instead provide the native endian encoding when "UCS-2" is specified.
Nonetheless, I wonder why the native endianness is considered in the first place when Windows doesn't even run on any big endian archs (to my knowledge). And true enough, 'WIN_StringToUTF8' from core/windows/SDL_windows.h is used everywhere else in the windows backend, which is just a macro to iconv with "UTF-16LE" as source. Therefore it would IMO make sense to use this macro here as well, which would solve my problem (patch attached).
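A sketch of the change the attached patch describes (before and after, as outlined above):
/* Before: relies on the "UCS-2-INTERNAL" encoding, which some iconv implementations lack. */
cmdline = SDL_iconv_string("UTF-8", "UCS-2-INTERNAL", (char *)(text), (SDL_wcslen(text)+1)*sizeof(WCHAR));
/* After: use the same helper as the rest of the Windows backend, which converts from "UTF-16LE". */
cmdline = WIN_StringToUTF8(text);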
Nitz
I added the xdnd_version check to the XdndPosition case as well,
under the DEBUG_XEVENTS macro;
this way we can get the action requested by the user.
I analysed further and found that removing the xdnd_version check in the XdndDrop case is a bad idea, because
in the XConvertSelection API the timestamp should be passed if (xdnd_version >= 1),
otherwise CurrentTime should be passed.
So the xdnd_version check is important in the XdndDrop case.
I made xdnd_version static so that it can store the version for the other cases as well.
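For illustration, a rough sketch of the timestamp rule described above (the atom and variable names are placeholders, not the actual SDL code):
/* In the XdndDrop case, the client message's timestamp may only be used when the source speaks XDND version >= 1. */
Time t = (xdnd_version >= 1) ? (Time) xevent.xclient.data.l[2] : CurrentTime;
XConvertSelection(display, XdndSelection, requested_target, PRIMARY, window, t);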