Hi everyone. I'm working on a project that uses SDL as (among other
things) a front-end to another application. The SDL window has to
disappear while that other application is running. I've been using
Java along with an SDL JNI binding that I wrote a little while back. To
make the SDL window go away temporarily, I simply quit SDL; then, when
I need the window again, I re-initialize SDL and reload all of my
graphics, etc.
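On the C side of the binding, the hide/show cycle looks roughly like
this (just a sketch; the function names and the 640x480 mode are
placeholders, not my actual code):

#include "SDL.h"

static SDL_Surface *screen = NULL;

/* Called when the other application launches. */
static void hide_frontend(void)
{
    SDL_Quit();    /* closes the window, but also uninstalls the parachute */
    screen = NULL;
}

/* Called when the front-end is needed again. */
static int show_frontend(void)
{
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_NOPARACHUTE) < 0)
        return -1;
    screen = SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);
    if (screen == NULL)
        return -1;
    /* ... reload images, fonts, etc. ... */
    return 0;
}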
The problem is that when SDL_Quit is called, it also calls
SDL_UninstallParachute, even though I initialize SDL with the
SDL_INIT_NOPARACHUTE flag. That should be fine in theory, since the
uninstall code appears to restore whatever signal handlers were
installed before SDL's (unless the previous handler was
SDL_Parachute).
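From my reading of SDL_fatal.c, the install/uninstall logic is roughly
the following (paraphrased from memory, with only SIGSEGV shown, so
treat it as a sketch rather than the actual sources):

#include <signal.h>

/* Stand-in for SDL's internal SDL_Parachute handler; the real one
   cleans up SDL and re-raises the signal. */
static void parachute(int sig)
{
    signal(sig, SIG_DFL);
    raise(sig);
}

static void install_parachute(void)
{
    /* Save the current handler, and put it back if it wasn't the
       default (i.e. don't clobber handlers someone else installed). */
    void (*ohandler)(int) = signal(SIGSEGV, parachute);
    if (ohandler != SIG_DFL)
        signal(SIGSEGV, ohandler);
}

static void uninstall_parachute(void)
{
    /* Swap in SIG_DFL, then restore whatever was there, unless it
       was the parachute itself. */
    void (*ohandler)(int) = signal(SIGSEGV, SIG_DFL);
    if (ohandler != parachute)
        signal(SIGSEGV, ohandler);
}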
However, in practice this doesn't exactly...well...work. It appears
that the JVM installs its own signal handlers, in particular a SIGSEGV
handler (which, as far as I can tell, it uses to turn bad dereferences
into exceptions). So the following (very stupid) Java code will throw
a NullPointerException:
int[][] dumbass = new int[32][];  // 32 rows, each of them null
int temp = dumbass[0][0];         // dereferences a null row -> NPE
If I run the above code prior to calling SDL_Quit(), it throws the
exception. However, if I run the code after calling SDL_Quit(), the JVM
explodes, throwing shrapnel in all directions.
If I comment out SDL_UninstallParachute() in the SDL sources and
recompile, then everything works fine. Note that if I make
SDL_UninstallParachute() set all of the signal handlers back to SIG_DFL,
then the JVM simply seg faults relatively quietly, rather than exploding
into little chunks. I'm not 100% sure what the difference might be,
but I was hoping someone would have a theory.
Anyone have any thoughts on this? I'm happy to write a simple sample
program or post code if it will help. I do have an easy workaround,
which is to call SDL_QuitSubSystem(SDL_INIT_EVERYTHING) instead of
SDL_Quit() in my JNI code, but I'm just really curious about this and
can't figure it out.
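For what it's worth, the workaround in the JNI glue is just the
following (a sketch; the JNI class and method names here are
placeholders, not my actual binding):

#include <jni.h>
#include "SDL.h"

JNIEXPORT void JNICALL
Java_sdljava_SDLMain_nativeQuit(JNIEnv *env, jclass cls)
{
    /* Shuts all the subsystems down like SDL_Quit would, but never
       touches the signal handlers the JVM installed. */
    SDL_QuitSubSystem(SDL_INIT_EVERYTHING);
}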
Thanks!
Eric Wittmann
wittmann at snet.net