SDL2 - Huge Memory Leak With Multiple Windows

Hi,
We have been trying to create two windows with SDL2 (the latest version at this time, 2.0.22); however, a huge memory leak appears when we do this.

This can be reproduced just by creating two OpenGL contexts (any version, either core or compatibility profile) and drawing with either Skia or direct OpenGL calls.

The sample code contains only two functions, main and paint. Paint first makes the window's context current (SDL_GL_MakeCurrent), then draws a hello triangle from its vertices, and finally swaps the buffers with SDL_GL_SwapWindow (glFlush and glFinish are in there too; we tested with and without them, so we did not forget those).
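
In outline, paint does this (a minimal sketch; our real code uses a vertex-buffer/shader setup appropriate for the 3.2 context, so the immediate-mode calls below are only a stand-in to show the per-frame call order):

#include <SDL.h>
#include <SDL_opengl.h>

// Sketch of the per-window paint step: bind, draw, flush, swap.
void paint(SDL_Window* window, SDL_GLContext context)
{
    SDL_GL_MakeCurrent(window, context);   // make this window's context current

    glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_TRIANGLES);                 // stand-in for the hello triangle
    glVertex2f(-0.5f, -0.5f);
    glVertex2f(0.5f, -0.5f);
    glVertex2f(0.0f, 0.5f);
    glEnd();

    glFlush();                             // tested with and without these two
    glFinish();

    SDL_GL_SwapWindow(window);             // the memory jump happens right after this
}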

On some Windows systems, the leak starts at about 1 MB per couple of frames; after a few seconds it jumps to 1 MB per frame, growing by 100-200 MB per second until it crashes DWM or the entire OS.

As a side note, when this happens the FPS tanks as well, not going above 100-150. With only one window the FPS is around 2500; with two simple windows drawing only a hello triangle, it drops to about 100 FPS in both windows.

– DEBUGGING NOTES –
Debugging in Visual Studio, the heap allocations appear to be as low as 30-40 MB while the process is using far more than that, such as 1500-2000 MB.
The memory usage jumps by 1 MB every frame, right after SDL_GL_SwapWindow. This can be seen in the Visual Studio debugger by stepping through one call at a time while the leak is happening.

Tested in a VM, on 2 GPUs, with 3 drivers.
In the VM the leak does not seem to happen (we have only been able to test there with a software OpenGL driver that renders on the CPU). However, on the non-VM systems the leak continues even with a software OpenGL driver.

IMPORTANT NOTE: After some time (we let the process consume as much as it could), the process's own usage stopped growing at 2400 MB and then fell back to 30 MB, yet the machine's total RAM usage kept climbing until we terminated the process. So past that point the leak no longer appears to come from the process itself; something else is leaking (e.g. a kernel-mode driver). To be sure, the leak stops and the RAM usage drops back to its usual value once we terminate the process.

We also tried calling wglMakeCurrent and wglSwapLayerBuffers directly, bypassing SDL2 entirely, but the leak still happens.
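
For reference, the direct-WGL path looks roughly like this (a minimal sketch; the hdc/hglrc handles are illustrative and come from the usual GetDC / SetPixelFormat / wglCreateContext setup, which is omitted here):

#include <windows.h>

// Illustrative handles, one pair per window (setup omitted in this sketch).
HDC   hdc1, hdc2;      // device contexts of the two windows
HGLRC hglrc1, hglrc2;  // one OpenGL rendering context per window

void RenderBothDirect()
{
    wglMakeCurrent(hdc1, hglrc1);                    // bind window 1's context
    // ... draw window 1 ...
    wglSwapLayerBuffers(hdc1, WGL_SWAP_MAIN_PLANE);  // swap window 1

    wglMakeCurrent(hdc2, hglrc2);                    // bind window 2's context
    // ... draw window 2 ...
    wglSwapLayerBuffers(hdc2, WGL_SWAP_MAIN_PLANE);  // swap window 2
}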

CPU (if required): i5-10400 and i7-6800K
GPU: GTX 1050 and GTX 1080 Ti

The leak seems to happen on both computers; both run Windows 10, with the OS version for each GPU below:
GTX 1050: Microsoft Windows - Version 21H1 (OS Build 19043.1526)
GTX 1080 Ti: Microsoft Windows - Version 21H1 (OS Build 19043.1706)

– FURTHER INFORMATION –
SDL Initialization Flags (First Test): SDL_INIT_VIDEO | SDL_INIT_EVENTS
SDL Initialization Flags (Second Test): SDL_INIT_EVENTS
SDL Initialization Flags (Third Test): 0

SDL Window Flags (First Test): SDL_WINDOW_RESIZABLE | SDL_WINDOW_OPENGL
SDL Window Flags (Second Test): SDL_WINDOW_OPENGL

SDL Window Creation Hints (First Test): [SDL_GL_CONTEXT_MAJOR_VERSION, 3], [SDL_GL_CONTEXT_MINOR_VERSION, 2], [SDL_GL_DOUBLEBUFFER, 1], [SDL_GL_DEPTH_SIZE, 24], [SDL_GL_STENCIL_SIZE, 8], [SDL_GL_RED_SIZE, 8], [SDL_GL_GREEN_SIZE, 8], [SDL_GL_BLUE_SIZE, 8], [SDL_GL_ALPHA_SIZE, 8], [SDL_GL_FRAMEBUFFER_SRGB_CAPABLE, 1]

SDL Window Creation Hints (Second Test): [SDL_GL_CONTEXT_MAJOR_VERSION, 3], [SDL_GL_CONTEXT_MINOR_VERSION, 2]

SDL Window Creation Hints (Third Test): 0

Window Creation Function: SDL_CreateWindow
Context Creation Function used: SDL_GL_CreateContext
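
Put together, the first-test configuration corresponds to a sequence like this (a minimal sketch per window; the title and size are placeholders and error handling is omitted):

// First-test attributes, set before creating the window and its context.
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_FRAMEBUFFER_SRGB_CAPABLE, 1);

SDL_Window* window = SDL_CreateWindow("Test", SDL_WINDOWPOS_UNDEFINED,
    SDL_WINDOWPOS_UNDEFINED, 1024, 768,
    SDL_WINDOW_RESIZABLE | SDL_WINDOW_OPENGL);
SDL_GLContext context = SDL_GL_CreateContext(window);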

These leaks do not happen with a similar API such as GLFW, and every other application works as it is supposed to; we note this to make clear that the issue is not with the hardware or the computers themselves.
We have been testing all of this for two days and have found no solution so far.
We tried OpenGL debugging, the Visual Studio debugger, RenderDoc, and memory profilers, but none of them shows anything. Oddly, when the process is started under RenderDoc, it does not appear to leak at all.

Any help would be appreciated,
Thanks.

MTuner could help figure out what is allocating all that memory: GitHub - milostosic/MTuner: MTuner is a C/C++ memory profiler and memory leak finder for Windows, PlayStation 4 and 3, Android and other platforms
You might need debugging symbols for at least your application and SDL for it to work properly.

Also, if you have a small sample to reproduce the issue, sharing it here could help.


MTuner shows the result in the attached image.
The tests are done in C# (using a native binding library); the functions we call from C# go straight to SDL2, so the leaks are not caused by C# code. If I find the time, I will share sample code to help you reproduce the issue.

If I don’t, it can be reproduced just by spawning two windows on the same thread and alternating SDL_GL_MakeCurrent + SDL_GL_SwapWindow (or any equivalent) between them.

Looks like whatever is causing your high memory usage isn’t tracked by MTuner (it claims the peak usage was about 45 MB).


While MTuner claims it’s around 45 MB (just as I said above, the Visual Studio debugger also reported the process using only around 45 MB), Task Manager and the OS report it as shown in the image, and if I keep it running this way, DWM crashes or the entire OS freezes from running out of memory.

I have a multiple-window demo (it creates five OpenGL windows and then periodically updates them using SDL2 functions, not direct OpenGL calls) and I’m not seeing any evidence of a memory leak in Windows with SDL 2.0.22.

I have tested it again in C and C#, but the leak somehow disappeared by itself; it now occurs intermittently, differently on every run (e.g. it happens one time, but on the next start it doesn’t).

However, there is still an issue: when I spawn two windows with SDL2, VSYNC effectively gets turned off and the FPS becomes erratic, e.g. 140-150 or 90. When I comment out one of the windows, VSYNC comes back and the FPS stabilizes at my refresh rate.

Even with VSYNC turned off, I get over 2000 FPS with one window, but with two windows it drops to 100-150. Any idea why this happens?
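
In case it’s relevant, we request the swap interval per context while that context is current. A minimal check like the one below (a sketch using SDL’s documented return values and the window/context names from the sample that follows) would confirm whether the driver actually accepted VSYNC on each context:

// SDL_GL_SetSwapInterval applies to whichever context is current at the
// time of the call, so it has to be requested once per context.
SDL_GL_MakeCurrent(wnd1, glc1);
if (SDL_GL_SetSwapInterval(1) != 0)
    SDL_Log("VSYNC rejected on context 1: %s", SDL_GetError());

SDL_GL_MakeCurrent(wnd2, glc2);
if (SDL_GL_SetSwapInterval(1) != 0)
    SDL_Log("VSYNC rejected on context 2: %s", SDL_GetError());

// SDL_GL_GetSwapInterval reports the interval of the current context.
SDL_Log("Context 2 swap interval: %d", SDL_GL_GetSwapInterval());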

Sample code:

#include <iostream>
#include <time.h>
#include <stdlib.h>

#pragma comment (lib, "SDL2")
#include <SDL.h>
#undef main

#pragma comment (lib, "opengl32")
#include <SDL_opengl.h>

#define WIDTH 1024
#define HEIGHT 768
#define POS_X SDL_WINDOWPOS_UNDEFINED
#define POS_Y SDL_WINDOWPOS_UNDEFINED

#define LOG(msg) std::cout << (msg) << std::endl

SDL_Window* wnd1;
SDL_Window* wnd2;

SDL_GLContext glc1;
SDL_GLContext glc2;

void Render(SDL_Window* window, SDL_GLContext ctx);

int main()
{
    srand((unsigned int)time(NULL));

    LOG("Initializing SDL2...");
    if (SDL_Init(SDL_INIT_EVENTS | SDL_INIT_VIDEO) < 0)
    {
        LOG("Cannot initalize SDL2.");
        return 0;
    }

    LOG("Creating window 1...");

    wnd1 = SDL_CreateWindow("Test Window 1", POS_X, POS_Y, WIDTH, HEIGHT, SDL_WINDOW_SHOWN | SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
    if (!wnd1)
    {
        LOG("Cannot create window 1.");
        return 0;
    }

    LOG("Creating context 1...");

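    // Ask SDL for a double-buffered OpenGL 1.1 context before creating it.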
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 1);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);

    glc1 = SDL_GL_CreateContext(wnd1);

    if (!glc1)
    {
        LOG("Cannot create context 1.");
        return 0;
    }

    SDL_GL_SetSwapInterval(1);

    LOG("Creating window 2...");

    wnd2 = SDL_CreateWindow("Test Window 2", POS_X, POS_Y, WIDTH, HEIGHT, SDL_WINDOW_SHOWN | SDL_WINDOW_OPENGL | SDL_WINDOW_RESIZABLE);
    if (!wnd2)
    {
        LOG("Cannot create window 2.");
        return 0;
    }

    LOG("Creating context 2...");

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 1);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);

    glc2 = SDL_GL_CreateContext(wnd2);

    if (!glc2)
    {
        LOG("Cannot create context 2.");
        return 0;
    }

    SDL_GL_SetSwapInterval(1);

    bool running = true;
    while (running)
    {
        SDL_Event e;
        while (SDL_PollEvent(&e))
        {
            if (e.type == SDL_QUIT)
                running = false;  // exit the loop so the shutdown code below runs
        }

        Render(wnd1, glc1);
        Render(wnd2, glc2);
    }

    LOG("Shutting down...");

    SDL_GL_DeleteContext(glc1);
    SDL_DestroyWindow(wnd1);

    SDL_GL_DeleteContext(glc2);
    SDL_DestroyWindow(wnd2);

    SDL_Quit();

    return 0;
}

// Returns a pseudo-random value in the range [min, max].
int testrand(int min, int max)
{
    return min + rand() % (max - min + 1);
}

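// Render: make the given window's context current, draw one randomly sized
// line in immediate mode, then present the frame with a buffer swap.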
void Render(SDL_Window* window, SDL_GLContext ctx)
{
    SDL_GL_MakeCurrent(window, ctx);

    int w, h;
    SDL_GetWindowSize(window, &w, &h);

    glViewport(0, 0, w, h);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, w, h, 0, -1, +1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    glBegin(GL_LINES);
    glColor3ub(255, 255, 255);
    glVertex2f(10, 10);

    float v1 = (float)testrand(100, 500);
    float v2 = (float)testrand(100, 500);
    glVertex2f(v1, v2);
    glEnd();

    SDL_GL_SwapWindow(window);
}

Note: As I said above, the leak only happens from time to time; when it does, memory jumps to 200 MB, falls back to 60 MB, then jumps to 300 MB, and this sample behaves the same way (at times).

The code is not cleaned up; I wrote it quickly to illustrate, so please ignore small mistakes that aren’t related to the runtime memory leak.

I’m updating my windows sequentially in a round-robin fashion, so I’m deliberately only setting the SDL_RENDERER_PRESENTVSYNC flag on one of them (the ‘main’ window). That has the expected effect of limiting the overall frame rate to nominally 60 fps on this PC.

If I change the code so that SDL_RENDERER_PRESENTVSYNC is set on all five windows (or their renderers, to be precise) the frame rate falls, but not to 12 fps as one might expect. So you may be right that VSYNC is behaving a bit oddly.
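
To make the arrangement concrete, it is roughly this (a sketch with the window creation details trimmed; only the first window’s renderer requests present-vsync):

// Five windows updated round-robin; only the 'main' window's renderer
// requests vsync, so a single present per loop blocks on the display.
SDL_Window*   windows[5];
SDL_Renderer* renderers[5];

for (int i = 0; i < 5; i++)
{
    windows[i] = SDL_CreateWindow("Demo", SDL_WINDOWPOS_UNDEFINED,
        SDL_WINDOWPOS_UNDEFINED, 640, 480, SDL_WINDOW_SHOWN);
    Uint32 flags = SDL_RENDERER_ACCELERATED |
                   (i == 0 ? SDL_RENDERER_PRESENTVSYNC : 0);
    renderers[i] = SDL_CreateRenderer(windows[i], -1, flags);
}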

I have finally found the cause of the memory leak.
It is caused by MSI Afterburner’s OSD (RivaTuner Statistics Server), which seems to be leaking in its kernel-mode driver.

See the result below:

The leak happens everywhere as long as MSI Afterburner’s OSD is running (even in a VM).
Thanks to everyone for their replies.

This does not solve the VSYNC issue, though; I’m still unsure why VSYNC behaves oddly with multiple windows.

@rtrussell For me the frame rate goes above my refresh rate (75 Hz on the first screen). If I open two windows, VSYNC stabilizes the FPS at 150 (as if doubling it); if I open three windows, it is 225 FPS in each window.