Bug: SDL causes heap corruption if Windows Low Fragmentation Heap set _before_ call to SDL_Init()

My application has a memory allocation pattern that generates a lot of fragmentation, so I enabled the Windows low fragmentation heap as follows (note that on Vista this allocator is used by default; I'm dealing with XP):

HANDLE heaps[1024];
DWORD total = GetProcessHeaps(1024, heaps);
for (DWORD i = 0; i < total; ++i)
{
    ULONG HeapFragValue = 2; // 2 enables the low fragmentation heap
    HeapSetInformation(heaps[i], HeapCompatibilityInformation,
                       &HeapFragValue, sizeof(HeapFragValue));
}

If this code is put before SDL_Init() as opposed to after it, I get:

The instruction at “0x696131f6” referenced memory at “0x00000770”.
The memory could not be “read”. Click OK to terminate the program.

The error happens not during SDL_Init() but later on.

HeapQueryInformation(heaps[i], HeapCompatibilityInformation, &HeapInformation, sizeof(HeapInformation), &length) always returns 1 (success). HeapSetInformation always fails for i == 2, but that by itself causes no problem. If I skip i == 4, there is no runtime error even when this code runs before SDL_Init().
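For what it's worth, a way to see which heaps actually end up in LFH mode (and which HeapSetInformation calls fail, and why) is to query each heap right after setting the value. This is a minimal Windows-only sketch, not taken from the report above; the diagnostic printf output is mine:

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    HANDLE heaps[1024];
    DWORD total = GetProcessHeaps(1024, heaps);

    for (DWORD i = 0; i < total; ++i)
    {
        ULONG value = 2; // 2 enables the low fragmentation heap
        if (!HeapSetInformation(heaps[i], HeapCompatibilityInformation,
                                &value, sizeof(value)))
        {
            // The LFH cannot be enabled on heaps created with
            // HEAP_NO_SERIALIZE, or while running under a debugger.
            printf("heap %lu: set failed, error %lu\n", i, GetLastError());
        }

        ULONG mode = 0;
        if (HeapQueryInformation(heaps[i], HeapCompatibilityInformation,
                                 &mode, sizeof(mode), NULL))
        {
            // mode: 0 = standard heap, 1 = look-aside lists, 2 = LFH
            printf("heap %lu: compatibility mode %lu\n", i, mode);
        }
    }
    return 0;
}
```

Comparing the per-heap modes before and after SDL_Init() would show whether SDL creates additional heaps or touches the ones that were switched to LFH.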

So, what gives?

> My application has a memory allocation pattern that generates a lot of
> fragmentation, so I enabled the Windows low fragmentation heap as follows
> (note that in Vista this allocator is used by default, I’m dealing with
> XP):
>
> <<<< SNIP >>>>
>
> So, what gives?

What version of SDL are you using? 1.2.13 or 1.3?