Double free or corruption when writing to SDL_Surface

Hi,
I’m migrating a C++ application built with SDL 1.2 to SDL 2.0.20 (on Linux Mint).
I have a Screen class which holds a uint8 *pixels_ buffer that is used to prepare the screen every frame.
Then, in a SystemSDL class, I have an updateScreen() method, called every frame, that copies the Screen::pixels_ data to the back buffer and presents the screen.
Here is an extract of the code:

bool SystemSDL::initialize(bool fullscreen) {
...
//pScreenSurface_ and pScreenTexture_ are members of SystemSDL
pScreenSurface_ = SDL_CreateRGBSurface(0, GAME_SCREEN_WIDTH, GAME_SCREEN_HEIGHT, 8, 0, 0, 0, 0);
pScreenTexture_ = SDL_CreateTexture(pRenderer_,
                                            SDL_PIXELFORMAT_ARGB8888,
                                            SDL_TEXTUREACCESS_STREAMING,
                                            GAME_SCREEN_WIDTH, GAME_SCREEN_HEIGHT);
...
}

void SystemSDL::updateScreen() {
    if (g_Screen.dirty()) {
        // Clear screen buffer
        SDL_RenderClear(pRenderer_);

        SDL_LockSurface(pScreenSurface_);

        const uint8 *srcPixels = g_Screen.pixels();
        Uint32 *dstPixels = (Uint32 *)pScreenSurface_->pixels;

        // Manual blitting to convert from 8bpp palette indexed values to 32bpp RGB for each pixel
        uint8 r, g, b;
        for (int i = 0; i < GAME_SCREEN_WIDTH * GAME_SCREEN_HEIGHT; i++) {
            uint8 index = srcPixels[i];

            r = pScreenSurface_->format->palette->colors[index].r;
            g = pScreenSurface_->format->palette->colors[index].g;
            b = pScreenSurface_->format->palette->colors[index].b;

            Uint32 c = ((r << 16) | (g << 8) | (b << 0)) | (255 << 24);

            dstPixels[i] = c;
        }

        SDL_UnlockSurface(pScreenSurface_);

        // Copy the pixels to the texture
        SDL_UpdateTexture(pScreenTexture_, NULL, pScreenSurface_->pixels, GAME_SCREEN_WIDTH * sizeof (Uint32));

        // Copy texture to the screen buffer
        SDL_RenderCopy(pRenderer_, pScreenTexture_, NULL, NULL);

        // Flip screen
        SDL_RenderPresent( pRenderer_ );
    }
}

SystemSDL::~SystemSDL() {
    if (pScreenSurface_) {
        SDL_FreeSurface(pScreenSurface_);
        pScreenSurface_ = nullptr;
    }

    if (pScreenTexture_) {
        SDL_DestroyTexture(pScreenTexture_);
        pScreenTexture_ = nullptr;
    }

    if (pRenderer_) {
        SDL_DestroyRenderer(pRenderer_);
        pRenderer_ = nullptr;
    }

    if (pWindow_) {
        SDL_DestroyWindow( pWindow_ );
        pWindow_ = nullptr;
    }

    SDL_Quit();
}

The problem is that when I exit the application, I get a “double free or corruption (!prev)” error.
I have found a workaround that doesn’t trigger this error: using a separate Uint32 *pixels_ array as the destination for the pixel transformation.

void SystemSDL::updateScreen() {
    if (g_Screen.dirty()|| (cursor_visible_ && update_cursor_)) {
        // Clear screen buffer
        SDL_RenderClear(pRenderer_);

        SDL_LockSurface(pScreenSurface_);

        const uint8 *srcPixels = g_Screen.pixels();
        //Uint32 *dstPixels = (Uint32 *)pScreenSurface_->pixels;

        // We do manual blitting to convert from 8bpp palette indexed values to 32bpp RGB for each pixel
        uint8 r, g, b;
        for (int i = 0; i < GAME_SCREEN_WIDTH * GAME_SCREEN_HEIGHT; i++) {
            uint8 index = srcPixels[i];

            r = pScreenSurface_->format->palette->colors[index].r;
            g = pScreenSurface_->format->palette->colors[index].g;
            b = pScreenSurface_->format->palette->colors[index].b;

            Uint32 c = ((r << 16) | (g << 8) | (b << 0)) | (255 << 24);

            pixels_[i] = c;
        }

        SDL_UnlockSurface(pScreenSurface_);

        // Copy the pixels to the texture
        SDL_UpdateTexture(pScreenTexture_, NULL, pixels_, GAME_SCREEN_WIDTH * sizeof (Uint32));
...

In that case, everything works fine. But then pScreenSurface_ is only used to store the palette, and I have two memory buffers for the same data.
If you have any idea, it would be nice to share!
Thanks!

This line looks suspicious:

SDL_UpdateTexture(pScreenTexture_, NULL, pScreenSurface_->pixels, GAME_SCREEN_WIDTH * sizeof (Uint32));

pScreenTexture_ uses SDL_PIXELFORMAT_ARGB8888 while pScreenSurface_ uses SDL_PIXELFORMAT_INDEX8, but the docs for SDL_UpdateTexture say:

The pixel data must be in the pixel format of the texture.

I also don’t think you should assume that the pitch is surface-width × pixel-size; there might be additional padding at the end of each row. You can use pScreenSurface_->pitch to get the actual pitch of the surface.
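For example, a pitch-safe copy steps through the destination one row at a time. A minimal sketch of the idea, using plain types rather than SDL (dstPitch plays the role of pScreenSurface_->pitch):

```cpp
#include <cstdint>
#include <cstring>

// Copy a tightly packed 32-bit source into a destination whose rows are
// dstPitch bytes apart; dstPitch may exceed width * sizeof(uint32_t)
// because of per-row padding.
void copyRows(uint8_t* dst, int dstPitch,
              const uint32_t* src, int width, int height) {
    for (int y = 0; y < height; ++y) {
        std::memcpy(dst + y * dstPitch,   // step by pitch, not by width
                    src + y * width,
                    width * sizeof(uint32_t));
    }
}
```

The same loop works for a real SDL_Surface by passing surface->pixels and surface->pitch after locking it.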

Hi,
Thanks for your answer.
I agree that there’s something inconsistent between the INDEX8 surface and the ARGB8888 texture.
But I tried to create the texture using SDL_CreateTextureFromSurface() and the given surface. I got a strange display (the result is shown in 4 columns :frowning: ) and I still got the “double free” message.
I also tried to create the surface with a 32-bit depth to match the texture format, but then loading a palette into the surface fails (it works well with the INDEX8 format).
Maybe I have to create 2 surfaces: one in INDEX8 and one in ARGB8888 to update the texture.

This should work if you do it every time, but if you’re still calling SDL_UpdateTexture you will have the same problem as before, because the texture will not necessarily be the same format as the surface. The docs say so:

The pixel format of the created texture may be different from the pixel format of the surface.


One common cause of “double free” in C++ is accidentally copying objects that free something in the destructor but don’t implement the copy constructor and copy assignment operator to handle copying correctly. This leads to two objects pointing to the same resource, and when the destructors get called it is freed twice.

If SystemSDL objects are not meant to be copied, you might want to disable the copy constructor and copy assignment operator by declaring them as deleted, so that you get a nice compilation error instead of a runtime error if you accidentally copy one somewhere.

class SystemSDL
{
	...
	SystemSDL(const SystemSDL&) = delete;
	SystemSDL& operator=(const SystemSDL&) = delete;
	...
};

Otherwise, my general advice with this sort of problem is to use “sanitizers” (e.g. -fsanitize=address,undefined if you’re using GCC or Clang) or Valgrind (not available on Windows). I think these tools are invaluable and can sometimes spot problems that I didn’t even know I had.


I don’t think indexed formats (the ones that use palettes) ever use more than 8 bits per pixel in SDL. Even if SDL supported it, it would still be a different format than a non-indexed format.

After having looked at your code more closely, I’m not so sure you actually want pScreenSurface_ to have a palette at all. The code inside the loop looks wrong. Normally the indices would come from the same surface that the palette belongs to, but here you’re getting the indices from g_Screen and using them to look up colours in pScreenSurface_’s palette, and then you’re writing 32-bit values into it even though it’s only an 8-bit format. Writing 4 bytes per pixel into a buffer that only allocates 1 byte per pixel overruns the allocation, which is very likely what corrupts the heap and triggers the “double free or corruption” error on exit.
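To see the scale of the problem: writing 32-bit values into a surface that allocates only one byte per pixel writes far past the end of the buffer. With the 640×400 screen from this thread, the arithmetic (no SDL needed) is:

```cpp
#include <cstddef>

// Bytes written past the end of an INDEX8 (1 byte/pixel) surface when the
// conversion loop stores a 4-byte Uint32 for every pixel.
std::size_t overrunBytes(int width, int height) {
    const std::size_t surfaceBytes = std::size_t(width) * height * 1;
    const std::size_t bytesWritten = std::size_t(width) * height * 4;
    return bytesWritten - surfaceBytes;
}
```

For 640×400 that is 768,000 bytes written beyond the 256,000-byte allocation, trampling whatever heap bookkeeping follows it.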

So maybe the solution is indeed to make pScreenSurface_ the same 32-bit format as the texture:

pScreenSurface_ = SDL_CreateRGBSurfaceWithFormat(0, GAME_SCREEN_WIDTH, GAME_SCREEN_HEIGHT, 32, SDL_PIXELFORMAT_ARGB8888);

and change it so that you use g_Screen’s palette instead of pScreenSurface_’s palette to look up the colours:

r = g_Screen->format->palette->colors[index].r;
g = g_Screen->format->palette->colors[index].g;
b = g_Screen->format->palette->colors[index].b;

(this will obviously not compile but you get the idea)
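If it helps, the lookup itself can be sketched with plain types; Color here is a hypothetical stand-in for SDL_Color:

```cpp
#include <cstdint>

// Hypothetical stand-in for SDL_Color (same r, g, b, a byte layout).
struct Color { uint8_t r, g, b, a; };

// Expand 8-bit palette indices into packed ARGB8888 pixels,
// forcing the alpha channel to opaque.
void indexToARGB(const uint8_t* src, const Color* palette,
                 uint32_t* dst, int count) {
    for (int i = 0; i < count; ++i) {
        const Color& c = palette[src[i]];
        dst[i] = (0xFFu << 24) | (uint32_t(c.r) << 16)
               | (uint32_t(c.g) << 8) | uint32_t(c.b);
    }
}
```

The key point is that dst must be a genuinely 32-bit buffer (e.g. the pixels of an ARGB8888 surface), not the INDEX8 surface the indices came from.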

Hi,

To give some insights on my code:

  • SystemSDL is a singleton class that encapsulates calls to SDL
  • The Screen class is also a singleton (g_Screen) that has an array of 640 x 400 pixels
  • Every drawing of sprites, tiles, fonts is made in this array
  • The palette is loaded from a file and stored in the pScreenSurface palette
  • Then, before presenting the screen, I copy all pixels from g_Screen to the texture and render the texture.

Following your inputs:
At initialization time:

  • The surface is still an INDEX8 surface:
    pScreenSurface_ = SDL_CreateRGBSurface(0, Screen::kScreenWidth, Screen::kScreenHeight, 8, 0, 0, 0, 0);
  • I create a texture from this surface:
    pScreenTexture_ = SDL_CreateTextureFromSurface(pRenderer_, pScreenSurface_);

For each frame:

void SystemSDL::updateScreen() {
//...
        const uint8 *srcPixels = g_Screen.pixels();

        memcpy (pScreenSurface_->pixels, srcPixels, Screen::kScreenWidth * Screen::kScreenHeight);

        SDL_UnlockSurface(pScreenSurface_);

        // Copy the pixels to the texture
        SDL_UpdateTexture(pScreenTexture_, NULL, pScreenSurface_->pixels, pScreenSurface_->pitch);

        // Copy texture to the screen buffer
        SDL_RenderCopy(pRenderer_, pScreenTexture_, NULL, NULL);

        // Flip screen
        SDL_RenderPresent( pRenderer_ );
}

Results:

  • First, I don’t have the double free or corruption anymore
  • Setting the palette works because the surface is INDEX8
    But:
  • The content of the surface/texture is displayed, but repeated 4 times horizontally (see the image).
    It seems to have something to do with the pitch I pass to SDL_UpdateTexture
  • The colors are not displayed correctly

Do you have any idea how to properly copy the screen pixels to the surface?

You still have the problem that the pixels you pass to SDL_UpdateTexture might not have the same format as the texture you’re updating.

You can use SDL_ConvertSurface or SDL_ConvertSurfaceFormat to convert a surface to whatever pixel format you need.

// Get the pixel format of the texture
Uint32 textureFormat;
SDL_QueryTexture(pScreenTexture_, &textureFormat, NULL, NULL, NULL);

// Convert the surface to the same format as the texture 
SDL_Surface* tmpSurface = SDL_ConvertSurfaceFormat(pScreenSurface_, textureFormat, 0);

// Update the texture
SDL_UpdateTexture(pScreenTexture_, NULL, tmpSurface->pixels, tmpSurface->pitch);

// clean up
SDL_FreeSurface(tmpSurface);

I’m not sure this is the most efficient way to do it, but at least it seems to work when I test it.