Memory leak in SDL_putenv() - can it be plugged safely?

I’m using some of the VS2005 CRT debug functions to track potential
memory leaks in my program. To ensure that I catch all leaks, in my
testing debug build I’ve included SDL and all of the other libraries I
use in my “catch area” for CRT leak detection.

I’ve nicely wrapped all my SDL functionality into a C++ “surface” class.
The class has a global instance counter. The constructor makes a heap
snapshot when the instance count is zero (before incrementing). The
destructor does a heap dump (leak check) when the instance count is zero
(after decrementing). This all works nicely to detect leaks in my
program.

The snapshot of the heap is made before SDL_Init() in the construction.
The heap dump is made after SDL_Quit() in the destructor.
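For reference, here’s roughly what that looks like (a minimal sketch, not my
actual class; _CrtMemCheckpoint() and _CrtMemDumpAllObjectsSince() are the
VS2005 CRT debug heap calls, and the SDL_INIT_VIDEO flag is just an example):

#define _CRTDBG_MAP_ALLOC
#include <crtdbg.h>
#include "SDL.h"

class Surface {
public:
    Surface() {
        if (s_instances++ == 0) {           /* count was zero: first instance */
            _CrtMemCheckpoint(&s_snapshot); /* heap snapshot before SDL_Init() */
            SDL_Init(SDL_INIT_VIDEO);
        }
    }
    ~Surface() {
        if (--s_instances == 0) {           /* count is zero: last instance */
            SDL_Quit();
            /* heap dump: report everything allocated since the snapshot */
            _CrtMemDumpAllObjectsSince(&s_snapshot);
        }
    }
private:
    static int s_instances;
    static _CrtMemState s_snapshot;
};

int Surface::s_instances = 0;
_CrtMemState Surface::s_snapshot;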

Now, I’ve noticed I get one and only one memory leak in SDL which is
caused by any call to SDL_putenv(). The memory allocated by
SDL_putenv() is never freed, even after SDL_Quit(). The SDL source code
even lets me know this is expected, since in SDL_getenv.c I see:

static char *SDL_envmem = NULL;    /* Ugh, memory leak */

When my program (in debug mode) exits, I want it to have ZERO memory
leaks. To work around the SDL leak, I check whether I have exactly one
leak. If so, I exit cleanly without reporting any memory leaks.
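In code, the workaround is roughly this, at the point where the destructor
would normally dump (a sketch; lCounts[_NORMAL_BLOCK] is the count of
ordinary heap blocks in a _CrtMemState difference):

_CrtMemState now, diff;
_CrtMemCheckpoint(&now);
if (_CrtMemDifference(&diff, &s_snapshot, &now)) {
    /* Exactly one leaked block? Assume it's SDL_envmem and keep quiet. */
    if (diff.lCounts[_NORMAL_BLOCK] != 1) {
        _CrtMemDumpAllObjectsSince(&s_snapshot);
    }
}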

Now here’s my question (finally)…

Is there any reason why SDL can’t free SDL_envmem on a call to
SDL_Quit()? Are there existing apps that would expect the environment
to be persistent after SDL_Quit() is called?

I can make this change locally for my use of SDL to make my app leak
free, so for my own use, it’s a no-brainer. I’m just wondering what
historical reasons exist (if any) that would prevent SDL_envmem from being
freed when SDL_Quit() is called.
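For the curious, the local plug I have in mind is tiny. Something along
these lines (a sketch only; SDL_FreeEnvmem() is a name I made up, and
SDL 1.2 has no such call):

/* in SDL_getenv.c */
void SDL_FreeEnvmem(void)
{
    if (SDL_envmem) {
        free(SDL_envmem);
        SDL_envmem = NULL;
        /* also reset the recorded buffer length here, if your
           copy of SDL_getenv.c tracks one */
    }
}

/* ...and at the end of SDL_Quit() in SDL.c: */
SDL_FreeEnvmem();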

Thanks,
Doug.

> Is there any reason why SDL can’t free SDL_envmem on a call to
> SDL_Quit()? Are there existing apps that would expect the environment
> to be persistent after SDL_Quit() is called?

Yes, some apps set environment variables before SDL_Init(), to affect
initialization, and then during execution call SDL_Quit() / SDL_Init()
to re-initialize. This isn’t an especially good practice, but at this
point it’s not something I want to break gratuitously in SDL 1.2.

Feel free to plug it in your copy though. :)

See ya!
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment

In article, OurFavoritePirate says…

> > Is there any reason why SDL can’t free SDL_envmem on a call to
> > SDL_Quit()? Are there existing apps that would expect the environment
> > to be persistent after SDL_Quit() is called?

> Yes, some apps set environment variables before SDL_Init(), to affect
> initialization, and then during execution call SDL_Quit() / SDL_Init()
> to re-initialize. This isn’t an especially good practice, but at this
> point it’s not something I want to break gratuitously in SDL 1.2.

> Feel free to plug it in your copy though. :)

My atexit() senses are tingling. If I plug this in my copy, I’m tempted to
free SDL_envmem in a registered atexit() handler rather than in SDL_Quit().
Since an atexit() handler doesn’t run until the process exits, the fix should
survive the SDL_putenv(), SDL_Init(), SDL_Quit(), SDL_Init() cycle.
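Roughly like this, inside SDL_getenv.c (a sketch; the handler name and the
one-shot registration guard are my own invention, and the rest of the
SDL_putenv() body is elided):

#include <stdlib.h>   /* atexit, free */

static void SDL_FreeEnvmem(void)
{
    free(SDL_envmem);
    SDL_envmem = NULL;
}

int SDL_putenv(const char *variable)
{
    static int registered = 0;
    if (!registered) {
        atexit(SDL_FreeEnvmem);  /* fires at process exit, not at SDL_Quit() */
        registered = 1;
    }
    /* ... existing SDL_putenv() logic unchanged ... */
}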

I’m not sure if having SDL depend on atexit() would work on all platforms,
but if I use this plug in my copy, I can solve my problem without breaking
the expectation that SDL_envmem stays valid across quit/init cycles.

Since my CRT memory leak code “goes off” in the destructor of a static class,
I’m pretty sure said destructor is called AFTER all atexit() handlers.

Thanks,
Doug.

> I’m not sure if having SDL depend on atexit() would work on all platforms

It doesn’t. If you dynamically load the library, it registers an atexit()
handler, and then you unload the library, your application will crash, since
the handler code no longer exists at exit.

> Since my CRT memory leak code “goes off” in the destructor of a static class,
> I’m pretty sure said destructor is called AFTER all atexit() handlers.

That behavior isn’t defined; it depends on the C++ runtime implementation.

All those issues aside, if it works for you, go for it! :)

See ya,
-Sam Lantinga, Lead Software Engineer, Blizzard Entertainment