Sound output latency

Hello,

Is there a way to determine precisely (to within 100 ms) the delay
between the moment when data is written into the buffer by the callback
function and the moment when it is actually played? I’m writing a
media player, and this info is necessary for proper audio/video sync.
Can anyone help me?

--
Best regards,
Eugene

[Team GADGET] [Team Two Divided By Zero] [Team Hackzone.ru]

Eugene Smith wrote on 11 Aug 2000:

Hello,

Is there a way to determine precisely (to within 100 ms) the delay
between the moment when data is written into the buffer by the callback
function and the moment when it is actually played? I’m writing a
media player, and this info is necessary for proper audio/video sync.
Can anyone help me?

AFAIK there is no way to tell when the sound card finally processes
the buffer contents. But if you choose a reasonably small buffer
size, you can assume that the contents will be played ‘pretty soon’
(i.e. within the next 10-50 ms). Beware of any sound daemons doing
additional buffering/mixing (ksound, esd, etc.).
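
To make the small-buffer estimate concrete, here is a minimal sketch
using the SDL audio API; the 1024-sample request and the silent
callback are placeholders of mine, not anything from the post.
Whatever SDL actually grants, one buffer corresponds to
obtained.samples / obtained.freq seconds of sound:

    #include "SDL.h"
    #include <string.h>
    #include <stdio.h>

    /* placeholder callback: a real player would decode/mix into stream */
    static void fill_audio(void *udata, Uint8 *stream, int len)
    {
        memset(stream, 0, len);               /* silence for this sketch */
    }

    int main(int argc, char *argv[])
    {
        SDL_AudioSpec desired, obtained;

        if (SDL_Init(SDL_INIT_AUDIO) < 0)
            return 1;

        desired.freq = 44100;
        desired.format = AUDIO_S16;
        desired.channels = 1;
        desired.samples = 1024;               /* request a small buffer */
        desired.callback = fill_audio;
        desired.userdata = NULL;

        if (SDL_OpenAudio(&desired, &obtained) < 0)
            return 1;

        /* one buffer bounds how soon freshly written data is heard */
        printf("buffer: %d samples (~%.0f ms)\n", obtained.samples,
               1000.0 * obtained.samples / obtained.freq);

        SDL_PauseAudio(0);                    /* start the callback */
        SDL_Delay(1000);
        SDL_CloseAudio();
        SDL_Quit();
        return 0;
    }

At 44100 Hz, 1024 samples come to about 23 ms, which is squarely in
the 10-50 ms range mentioned above.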

Hello,

The problem is not only the buffer size. There’s also the DMA buffer
inside the kernel, which can sometimes be as large as 128 KB (about
1.5 seconds of 44100/16/mono). So far I have done audio output
directly through /dev/dsp and was able to ask the sound driver
directly about the DMA buffer size in use. That worked for me, but
not for everyone. Now I’m considering using SDL for output, but I’ve
run into this problem…

--
Best regards,
Eugene

[Team GADGET] [Team Two Divided By Zero] [Team Hackzone.ru]
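
For reference, the direct /dev/dsp query Eugene describes can be done
with the OSS SNDCTL_DSP_GETOSPACE ioctl; a rough, Linux-only sketch:

    #include <sys/soundcard.h>
    #include <sys/ioctl.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    int main(void)
    {
        audio_buf_info info;
        int fd, total;

        fd = open("/dev/dsp", O_WRONLY);
        if (fd < 0) {
            perror("open /dev/dsp");
            return 1;
        }
        if (ioctl(fd, SNDCTL_DSP_GETOSPACE, &info) < 0) {
            perror("SNDCTL_DSP_GETOSPACE");
            close(fd);
            return 1;
        }

        total = info.fragstotal * info.fragsize;  /* whole DMA buffer */
        /* 44100 Hz * 16 bit * mono = 88200 bytes per second */
        printf("DMA buffer: %d bytes (%.2f s of 44100/16/mono)\n",
               total, total / 88200.0);

        close(fd);
        return 0;
    }

A 128 KB buffer indeed works out to roughly 1.49 seconds at that
sample format, matching the figure above.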

Is there a way to determine precisely (to within 100 ms) the delay
between the moment when data is written into the buffer by the callback
function and the moment when it is actually played? I’m writing a
media player, and this info is necessary for proper audio/video sync.

SDL has no provisions for that. It is possible for some sound
hardware/drivers (for instance, Solaris can send a signal each time a
special marker in the audio stream is processed), but SDL just tries
to play the audio “as soon as possible”.

SDL calls the callback function a little before the current buffer would
run low, to give the game enough time to mix together a chunk of audio.
But it is probably not enough time to do heavier audio processing such
as mp3 decoding, at least not on slow machines.

That is correct, although given the proper buffer size, you can
interpolate between calls to your callback to find out when the audio
data is actually playing (see the sketch after the timeline below). In
all environments, SDL will attempt to call your callback one buffer
before it’s needed:

playing starts
your audio function is called
your buffer starts playing
your audio function is called and buffer is queued for playback
your first buffer stops and second buffer plays
your audio function is called and buffer is queued for playback
your second buffer stops and third buffer plays

etc.
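
One way to realize the interpolation Sam describes, as a sketch: the
names are mine, and the two-buffer offset is my reading of the
timeline above (a driver that buffers differently will shift it).
Assumes 16-bit mono, with spec being the obtained SDL_AudioSpec:

    #include "SDL.h"

    static Uint32 cb_ticks;       /* SDL_GetTicks() at the last callback */
    static Uint32 total_samples;  /* samples handed to SDL so far */
    static SDL_AudioSpec spec;    /* filled in by SDL_OpenAudio() */

    static void fill_audio(void *udata, Uint8 *stream, int len)
    {
        cb_ticks = SDL_GetTicks();
        /* ... decode len bytes of audio into stream here ... */
        total_samples += len / 2;     /* 16-bit mono: 2 bytes per sample */
    }

    /* Estimate which stream sample is coming out of the speaker now.
       Per the timeline, the buffer a callback fills starts playing
       about one buffer later, so the buffer starting to play *at*
       callback time lies two buffers behind the write position;
       interpolate forward with the wall-clock time since the callback. */
    static Uint32 playing_now(void)
    {
        Uint32 samples, ticks;
        Sint32 pos;

        SDL_LockAudio();              /* read the pair consistently */
        samples = total_samples;
        ticks = cb_ticks;
        SDL_UnlockAudio();

        pos = (Sint32)(samples - 2 * spec.samples)
            + (Sint32)((SDL_GetTicks() - ticks) * spec.freq / 1000);
        return pos > 0 ? (Uint32)pos : 0;
    }

For A/V sync, the video thread can then compare playing_now() against
each frame’s audio timestamp and delay or drop frames accordingly.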
Some audio drivers are broken in that they buffer at a lower level
without letting SDL know, but in general this works fairly well. On
some hardware, you can do direct DMA to the audio buffer for precise
control of latency, but SDL doesn’t enable this by default, as it’s
not supported by most audio drivers and is deprecated by driver
authors.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Is there a way to determine precisely (to within 100 ms) the delay
between the moment when data is written into the buffer by the callback
function and the moment when it is actually played? I’m writing a
media player, and this info is necessary for proper audio/video sync.
Can anyone help me?

This varies wildly from backend to backend. If you’re using ESD,
abandon every hope; it’ll be over 500 ms in many cases. The normal
OSS driver will be very fast (with small fragment sizes), and the DMA
driver is absurdly fast (but only available with some drivers). You
can influence the driver selection with environment variables.
Unfortunately, I don’t know if it’s possible to get an exact
measurement of the latency.
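
The selection John mentions goes through the SDL_AUDIODRIVER
environment variable; a sketch, assuming POSIX setenv and that “dsp”
and “dma” are the OSS driver names in your SDL build:

    #include "SDL.h"
    #include <stdlib.h>
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        /* must be set before SDL_Init(); other values include
           "dsp", "esd", ... */
        setenv("SDL_AUDIODRIVER", "dma", 1);

        if (SDL_Init(SDL_INIT_AUDIO) < 0) {
            fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
            return 1;
        }
        /* ... open audio as usual ... */
        SDL_Quit();
        return 0;
    }

Equivalently, set it in the shell before launching the player, e.g.
SDL_AUDIODRIVER=dma ./myplayer (myplayer being a stand-in name).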

-John