From 4c2be472071bf676390214cb942e4569f57a158f Mon Sep 17 00:00:00 2001
From: "Ryan C. Gordon"
Date: Tue, 7 Apr 2020 14:37:24 -0400
Subject: [PATCH] wasapi: Improve WASAPI audio backend latency (thanks, Anthony!).

Anthony Pesch's notes on his patch:

"Currently, the WASAPI backend creates a stream in shared mode and sets the
device's callback size to be half of the shared stream's total buffer size.

This works, but doesn't coordinate well with the actual hardware. The hardware
will raise an interrupt after every period, which in turn will signal the
object being waited on inside of WaitDevice. From my empirical testing, the
callback size was often larger than the period size and not a multiple of it,
which resulted in poor latency when trying to time an application based on the
audio callback.

The reason for this looked something like:

* The device's callback would be called and the audio buffer would be filled.
* WaitDevice would be called.
* The hardware would raise an interrupt after one period.
* WaitDevice would resume, see that a full callback had not been played, and
  then wait again.
* The hardware would raise an interrupt after another period.
* WaitDevice would resume, see that a full callback plus some extra amount had
  been played, then call our callback again, and this process would repeat.

The effect of this is that the pacing between subsequent callbacks is poor -
sometimes it's called very quickly, sometimes it's called very late.

By matching the callback's size to the stream's period size, the pacing of
calls to the user callback is improved substantially.

I didn't write an actual test for this, but my use case for this was my
Dreamcast emulator (https://redream.io), which uses the audio callback to help
drive the emulation speed. Without this change and with the default shared
stream buffer (which has a period of ~10 ms), I would get frame times between
~3-30 milliseconds; after this change I get frame times of ~11-22 milliseconds.

Note, this patch also has a change that removes passing a duration to the
Initialize call. It seems that the default duration used (when 0 is passed)
does typically match up with the duration returned by GetDevicePeriod;
however, the Initialize docs say:

> To set the buffer to the minimum size required by the engine thread, the
> client should call Initialize with the hnsBufferDuration parameter set to 0.
> Following the Initialize call, the client can get the size of the resulting
> buffer by calling IAudioClient::GetBufferSize.

This change isn't strictly required, but I made it to hopefully rule out
another source of unexpected latency."

Fixes Bugzilla #4592.
---
 src/audio/wasapi/SDL_wasapi.c | 15 ++++++++-------
 1 file changed, 8 insertions(+), 7 deletions(-)

diff --git a/src/audio/wasapi/SDL_wasapi.c b/src/audio/wasapi/SDL_wasapi.c
index 66b13ab31..ea80e53b9 100644
--- a/src/audio/wasapi/SDL_wasapi.c
+++ b/src/audio/wasapi/SDL_wasapi.c
@@ -507,7 +507,7 @@ WASAPI_PrepDevice(_THIS, const SDL_bool updatestream)
     const SDL_AudioSpec oldspec = this->spec;
     const AUDCLNT_SHAREMODE sharemode = AUDCLNT_SHAREMODE_SHARED;
     UINT32 bufsize = 0; /* this is in sample frames, not samples, not bytes. */
-    REFERENCE_TIME duration = 0;
+    REFERENCE_TIME default_period = 0;
     IAudioClient *client = this->hidden->client;
     IAudioRenderClient *render = NULL;
     IAudioCaptureClient *capture = NULL;
@@ -571,7 +571,7 @@ WASAPI_PrepDevice(_THIS, const SDL_bool updatestream)
         return SDL_SetError("WASAPI: Unsupported audio format");
     }
 
-    ret = IAudioClient_GetDevicePeriod(client, NULL, &duration);
+    ret = IAudioClient_GetDevicePeriod(client, &default_period, NULL);
     if (FAILED(ret)) {
         return WIN_SetErrorFromHRESULT("WASAPI can't determine minimum device period", ret);
     }
@@ -590,7 +590,7 @@ WASAPI_PrepDevice(_THIS, const SDL_bool updatestream)
     }
 
     streamflags |= AUDCLNT_STREAMFLAGS_EVENTCALLBACK;
-    ret = IAudioClient_Initialize(client, sharemode, streamflags, duration, sharemode == AUDCLNT_SHAREMODE_SHARED ? 0 : duration, waveformat, NULL);
+    ret = IAudioClient_Initialize(client, sharemode, streamflags, 0, 0, waveformat, NULL);
     if (FAILED(ret)) {
         return WIN_SetErrorFromHRESULT("WASAPI can't initialize audio client", ret);
     }
@@ -605,10 +605,11 @@ WASAPI_PrepDevice(_THIS, const SDL_bool updatestream)
         return WIN_SetErrorFromHRESULT("WASAPI can't determine buffer size", ret);
     }
 
-    this->spec.samples = (Uint16) bufsize;
-    if (!this->iscapture) {
-        this->spec.samples /= 2; /* fill half of the DMA buffer on each run. */
-    }
+    /* Match the callback size to the period size to cut down on the number of
+       interrupts waited for in each call to WaitDevice */
+    float period_millis = default_period / 10000.0f;
+    float period_frames = period_millis * this->spec.freq / 1000.0f;
+    this->spec.samples = (Uint16)ceil(period_frames);
 
     /* Update the fragment size as size in bytes */
     SDL_CalculateAudioSpec(&this->spec);
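
For reference, a minimal standalone sketch of the period-to-frames conversion
used in the last hunk above: REFERENCE_TIME counts 100-nanosecond ticks, hence
the 10000 divisor to get milliseconds. The helper name period_to_frames and
the 10 ms / 48 kHz example values are illustrative only, not taken from the
patch or from SDL.

    #include <math.h>
    #include <stdio.h>

    /* Convert a WASAPI device period (in 100-nanosecond REFERENCE_TIME ticks)
       to a whole number of sample frames at the given sample rate. */
    static unsigned int period_to_frames(long long default_period, int freq)
    {
        const double period_millis = default_period / 10000.0;       /* 100 ns -> ms */
        const double period_frames = period_millis * freq / 1000.0;  /* ms -> frames */
        return (unsigned int) ceil(period_frames);
    }

    int main(void)
    {
        /* A typical 10 ms shared-mode period (100000 ticks) at 48000 Hz
           works out to 480 frames per callback. Prints "480". */
        printf("%u\n", period_to_frames(100000, 48000));
        return 0;
    }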