Question about SDL_ConvertAudio

I had an unexpected result when using SDL_ConvertAudio. I was downsampling 16-bit audio from 44100Hz to 22050Hz. The source buffer had an odd number of samples, and the converted buffer ended up with an odd number of bytes, which can't be right for 16-bit samples. Am I doing something wrong? Here is a minimal test case:

Code:

#include <SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO);

    /* Mono S16 at 44100 Hz -> mono S16 at 22050 Hz */
    SDL_AudioCVT cvt;
    SDL_BuildAudioCVT(&cvt, AUDIO_S16SYS, 1, 44100,
                      AUDIO_S16SYS, 1, 22050);

    /* 4095 16-bit source samples */
    cvt.len = 4095 * sizeof(Sint16);
    cvt.buf = SDL_malloc(cvt.len * cvt.len_mult);
    SDL_ConvertAudio(&cvt);
    return 0;
}

When I examine the contents of cvt after the call to SDL_ConvertAudio I see:

Code:

Breakpoint 1, main () at at.c:11
11 return 0;
(gdb) $1 = {
needed = 1,
src_format = 32784,
dst_format = 32784,
rate_incr = 0.5,
buf = 0x101874800 "",
len = 8190,
len_cvt = 4095,
len_mult = 1,
len_ratio = 0.5,
filters = {0x10001c19d <SDL_Downsample_S16LSB_1c_x2>, 0, 0, 0, 0, 0, 0, 0, 0, 0},
filter_index = 1
}

Note that len_cvt is 4095 bytes, an odd number, which cannot correspond to a whole number of 16-bit samples.
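
Since 4095 source samples at a rate ratio of 0.5 should come out to roughly 2047 or 2048 whole samples (4094 or 4096 bytes), I would expect a check like the following, placed right after the SDL_ConvertAudio call in the test case above, to pass. This is just my own sanity check, not something the SDL documentation promises explicitly:

Code:

/* My own assumption: the converted byte count should be a
 * multiple of the sample size for 16-bit audio. */
if (cvt.len_cvt % sizeof(Sint16) != 0) {
    SDL_Log("len_cvt = %d is not a whole number of 16-bit samples",
            cvt.len_cvt);
}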
This is SDL 2.0.3 on OS X.
Thanks in advance,
Pete
