To prevent SDL from using its window

Hello!

I am really new to SDL, so please bear with me…

I am making a DLL extension for a piece of 2D game coding software (called
Fenix). My extension is a 3D engine using OpenGL.

Fenix uses SDL… so that’s why I’m here. Fenix uses software blitting onto a
hardware SDL_Surface (called “screen”) set up with SDL_SetVideoMode().

In my DLL, I take over the window to use for OpenGL. (I would then like to grab
the data from the “screen” surface and send it to OpenGL as a texture, so that
I can still display Fenix’s software drawing, but from within my 3D engine.)
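
Roughly the kind of thing I have in mind (just a sketch, not real code; the
function name is made up, and it assumes a 16-bit 5-6-5 “screen” surface and
OpenGL 1.2-level support for GL_UNSIGNED_SHORT_5_6_5):

#include <SDL.h>
#include <SDL_opengl.h>	/* pulls in the GL headers portably */
#ifndef GL_UNSIGNED_SHORT_5_6_5
#define GL_UNSIGNED_SHORT_5_6_5 0x8363	/* OpenGL 1.2 token, if the headers lack it */
#endif

static GLuint screen_tex = 0;

/* Upload the software-drawn surface into a GL texture each frame.
 * (On older GL without non-power-of-two textures, the texture would have
 * to be rounded up to power-of-two sizes and filled as a sub-rectangle.) */
void upload_screen_texture(SDL_Surface *s)
{
	if (!screen_tex)
	{
		glGenTextures(1, &screen_tex);
		glBindTexture(GL_TEXTURE_2D, screen_tex);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
		glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		/* Allocate storage once; assumes a 16 bpp 5-6-5 surface */
		glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, s->w, s->h, 0,
			GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
	}
	glBindTexture(GL_TEXTURE_2D, screen_tex);

	if (SDL_MUSTLOCK(s))
		SDL_LockSurface(s);
	/* pitch is in bytes; UNPACK_ROW_LENGTH wants pixels (2 bytes each here) */
	glPixelStorei(GL_UNPACK_ROW_LENGTH, s->pitch / 2);
	glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, s->w, s->h,
		GL_RGB, GL_UNSIGNED_SHORT_5_6_5, s->pixels);
	glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
	if (SDL_MUSTLOCK(s))
		SDL_UnlockSurface(s);
}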

However, I don’t know how to stop SDL from playing with the window.

If I re-set “screen” as a memory surface using SDL_CreateRGBSurface, I can see
my OpenGL output!! (this was quite a breakthrough for me…)

However, I think SDL is still somehow attached to the window, as I see some
weird things in my OpenGL output, as if the depth buffer is being modified or
something… polygons appearing in front of closer ones, etc.

The only time I can get my OpenGL rendering to display normally is when Fenix
does absolutely no software drawing (and therefore probably does not call the
SDL update or flip functions).

Sorry for the long explanation…can anyone help me?

Thanks.

[…]

However, I don’t know how to stop SDL from playing with the window.

Well, you can’t, basically.

However, the display surface is (more or less) just another surface,
as seen from the application - so you can just create a software
shadow surface and hand that to the 2D part of the application, and
have it use that instead of the display surface.

If I re-set “screen” as a memory surface using SDL_CreateRGBSurface,
I can see my OpenGL output!! (this was quite a breakthrough for me…)

Almost there… :-)

However, I think SDL is still somehow attached to the window, as I
see some weird things in my OpenGL output, as if the depth buffer is
being modified or something… polygons appearing in front of closer
ones, etc.

Hmm… How do you create the display surface? Which SDL calls is the
2D part of the application using? If it’s just blitting to the s/w
shadow surface you gave it, it shouldn’t mess with the OpenGL state.

Are you by any chance using glSDL for the 2D rendering? That would
probably screw up your OpenGL state whenever someone calls
SDL_Flip()…

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Sunday 20 March 2005 00.35, Daniel wrote:

Sorry this is a double post…
I thought it hadn’t gone through (it took 14 hours!).

David Olofson <david olofson.net> writes:

Hmm… How do you create the display surface? Which SDL calls is the
2D part of the application using? If it’s just blitting to the s/w
shadow surface you gave it, it shouldn’t mess with the OpenGL state.

In Fenix (I can’t modify this), it’s created using:

sdl_flags = SDL_HWPALETTE;
if (double_buffer)
	sdl_flags |= SDL_DOUBLEBUF;
if (full_screen)
	sdl_flags |= SDL_FULLSCREEN;
if (hardware_scr)
	sdl_flags |= SDL_HWSURFACE;
else
	sdl_flags |= SDL_SWSURFACE;
if (frameless)
	sdl_flags |= SDL_NOFRAME;

screen = SDL_SetVideoMode (width, height, 
	  ((enable_16bits || enable_2xscale) ? 16:8), sdl_flags);

It uses 16 bits; I’m not sure what the “2xscale” is, but I don’t think it is
used.
I think the only flags it uses are SDL_DOUBLEBUF and SDL_HWSURFACE…
Although I am not sure of the difference between SDL_HWSURFACE and
SDL_SWSURFACE. If it is SDL_SWSURFACE, then it is only in memory and not
visible?

In my DLL, I reset screen using:

screen = SDL_CreateRGBSurface(SDL_HWSURFACE, 1024, 768, 16, screen->rmask…etc
(using masks from previous screen)

(I don’t have access to my exact code at the moment)

Fenix uses variations on these, depending on how it’s set up (sometimes it
calls SDL_UpdateRects(), I think):

SDL_UpdateRect(screen, 0, 0, 0, 0);
SDL_Flip(screen);

I can search the Fenix sources for any other functions which might cause the
problems.

Are you by any chance using glSDL for the 2D rendering? That would
probably screw up your OpenGL state whenever someone calls
SDL_Flip()…

No SDL GL functions are used by Fenix, only software access to the surface data.

thanks,
Daniel

[…]

Although I am not sure of the difference between SDL_HWSURFACE and
SDL_SWSURFACE. If it is SDL_SWSURFACE, then it is only in memory and
not visible?

Well, sort of; if it’s a s/w surface, it’s in system RAM, and a s/w or
DMA blit into VRAM is required to make it visible.

If you ask for a h/w surface, you may get a surface with its pixel
array allocated in VRAM, texture space or whatever applies to the
backend in use.

A h/w display surface is (normally) an actual display page, meaning
that no blit at all is required to make it visible; just a switch of
hardware pointers. (Done via SDL_Flip(), retrace sync’ed where
possible.)
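
In code, the difference usually looks something like this (just an
illustration of the SDL 1.2 calls involved, not anything from Fenix):

#include <SDL.h>

/* Present whatever has been drawn into the display surface */
void present(SDL_Surface *screen)
{
	if (screen->flags & SDL_DOUBLEBUF)
		SDL_Flip(screen);	/* h/w page flip; retrace sync'ed where possible */
	else
		SDL_UpdateRect(screen, 0, 0, 0, 0);	/* blit/copy; (0,0,0,0) == whole surface */
}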

In my DLL, I reset screen using:

screen = SDL_CreateRGBSurface(SDL_HWSURFACE, 1024, 768, 16,
screen->rmask…etc
(using masks from previous screen)

…with SDL_OPENGL added? Or are you bypassing SDL and creating the
OpenGL context some other way?

[…]

Are you by any chance using glSDL for the 2D rendering? That would
probably screw up your OpenGL state whenever someone calls
SDL_Flip()…

No SDL GL functions are used by Fenix, only software access to the
surface data.

Well, glSDL is not OpenGL. It’s an implementation of the SDL 2D API
over OpenGL. Two implementations, actually:

glSDL/wrapper (old, deprecated):
If Fenix is using this, it means it’s compiled with a wrapper that
redefines some SDL calls, pointing them to alternative, compiled-in
implementations. The only difference in the source would be that
glSDL.h is included instead of SDL.h, and there’s a flag SDL_GLSDL
passed to (the wrapped) SDL_SetVideoMode() to enable OpenGL
acceleration.
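
That is, if glSDL/wrapper were in use, the application source would contain
something along these lines (hypothetical example; Fenix obviously doesn’t do
this):

#include "glSDL.h"	/* instead of SDL.h; the wrapper redefines some SDL calls */

SDL_Surface *open_display(int w, int h)
{
	/* SDL_GLSDL tells the wrapper to route the SDL 2D API through OpenGL */
	return SDL_SetVideoMode(w, h, 16, SDL_GLSDL | SDL_DOUBLEBUF);
}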

glSDL/backend:
If you were using an SDL lib with glSDL compiled in (not yet in
official SDL releases), it wouldn’t even affect the Fenix binary. You
could tell SDL to activate the glSDL backend by means of an
environment variable without Fenix even being aware of it, and Fenix
would happily (hopefully) use OpenGL like any other SDL backend.

Anyway, the problem is that you’re not really supposed to use the SDL
2D API (whether it’s wired to glSDL, or any other backend) when
you’re using OpenGL. I know that glSDL will mess with the OpenGL
state (I created it :-), but I suspect that other 2D backends may do
weird stuff as well when you (or in this case, Fenix) call
functions that are not supposed to be used in “OpenGL mode”.

I’m actually surprised SDL doesn’t just plain crash. (It used to do that
when using SDL_BlitSurface() to an OpenGL display and the like.)

I’m not sure it’s at all possible to handle an application making SDL
2D calls while some DLL “cheats” SDL into switching to an OpenGL
display. I’m afraid you’ll have to find some way to keep Fenix away
from the display surface, meaning no calls to SDL_UpdateRect(s)(),
SDL_Flip() and maybe some other calls I’m forgetting. Or maybe modify
SDL to deal with it somehow; forward the offending calls to your DLL
or something…
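
For example (purely hypothetical; all the names below are made up), the DLL
could export a hook, and a patched/forwarded SDL_Flip() could call that
instead of touching the real display:

#include <SDL.h>

static void (*present_hook)(void) = NULL;

/* Exported by the DLL: the 3D engine registers its "present a frame" routine */
void MyGL_SetPresentHook(void (*hook)(void))
{
	present_hook = hook;
}

/* Patched SDL_Flip(): instead of flipping the real display, tell the 3D
 * engine that the 2D side has finished a frame, so it can re-upload the
 * shadow surface as a texture and do its own buffer swap. */
int Patched_SDL_Flip(SDL_Surface *screen)
{
	(void)screen;
	if (present_hook)
		present_hook();
	return 0;
}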

//David Olofson - Programmer, Composer, Open Source Advocate

.- Audiality -----------------------------------------------.
| Free/Open Source audio engine for games and multimedia. |
| MIDI, modular synthesis, real time effects, scripting,… |
`-----------------------------------> http://audiality.org -’
http://olofson.net | http://www.reologica.se

On Monday 21 March 2005 10.38, Daniel wrote:

David Olofson <david olofson.net> writes:

In my DLL, I reset screen using:

screen = SDL_CreateRGBSurface(SDL_HWSURFACE, 1024, 768, 16,
screen->rmask…etc
(using masks from previous screen)

Just realised a mistake I made…sorry (I have access to my code now!)

newsurface = SDL_CreateRGBSurface(SDL_SWSURFACE,
	screen->w, screen->h, screen->format->BitsPerPixel,
	screen->format->Rmask, screen->format->Gmask,
	screen->format->Bmask, screen->format->Amask);
screen = newsurface;

In that function I do in fact create an SDL_SWSURFACE, not SDL_HWSURFACE as I
said before…

…with SDL_OPENGL added? Or are you bypassing SDL and creating the
OpenGL context some other way?

I don’t (knowingly) use any SDL OpenGL wrapper functions; I use the same method
for calling OpenGL as in a normal C++ Windows program (getting the current DC,
choosing a pixel format, and using wglCreateContext and wglMakeCurrent).
Also, that’s another thing… I thought the pixel format could only be set once
(and would already have been set by SDL)? But wglCreateContext only succeeds
when I set one myself.
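
For reference, what I do is roughly the standard Win32 way, something like
this (a from-memory sketch, not my exact code; I’m assuming here that the
window handle comes from SDL_GetWMInfo(), and error checking is left out):

#include <windows.h>
#include <SDL.h>
#include <SDL_syswm.h>
#include <GL/gl.h>

HGLRC create_gl_context(void)
{
	SDL_SysWMinfo info;
	PIXELFORMATDESCRIPTOR pfd;
	HWND hwnd;
	HDC hdc;
	HGLRC ctx;
	int pf;

	/* Get the HWND of the window SDL created */
	SDL_VERSION(&info.version);
	SDL_GetWMInfo(&info);
	hwnd = info.window;
	hdc = GetDC(hwnd);

	/* Choose and set a pixel format (normally SDL does this itself when
	 * the display is opened with SDL_OPENGL) */
	ZeroMemory(&pfd, sizeof(pfd));
	pfd.nSize = sizeof(pfd);
	pfd.nVersion = 1;
	pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
	pfd.iPixelType = PFD_TYPE_RGBA;
	pfd.cColorBits = 16;
	pfd.cDepthBits = 16;
	pf = ChoosePixelFormat(hdc, &pfd);
	SetPixelFormat(hdc, pf, &pfd);

	/* Create and activate the OpenGL rendering context */
	ctx = wglCreateContext(hdc);
	wglMakeCurrent(hdc, ctx);
	return ctx;
}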

glSDL/wrapper (old, deprecated):
If Fenix is using this, it means it’s compiled with a wrapper that
redefines some SDL calls, pointing them to alternative, compiled-in
implementations. The only difference in the source would be that
glSDL.h is included instead of SDL.h, and there’s a flag SDL_GLSDL
passed to (the wrapped) SDL_SetVideoMode() to enable OpenGL
acceleration.

I’m pretty sure Fenix uses no OpenGL whatsoever. Fenix definitely includes SDL.h,
not glSDL.h, and I’ve searched the source for references to “gl” and found none.

Anyway, the problem is that you’re not really supposed to use the SDL
2D API (whether it’s wired to glSDL, or any other backend) when
you’re using OpenGL. I know that glSDL will mess with the OpenGL
state (I created it :-), but I suspect that other 2D backends may do

Sorry I’m not using your creation… yet, anyway :)

weird stuff as well when you (or in this case, Fenix) call
functions that are not supposed to be used in “OpenGL mode”.

I don’t really understand this… the way I see it, Fenix (through SDL) should
no longer touch the real screen once it is using a different surface. And it
seems to work fine, apart from the strangely messed-up depth buffer…
Knowing my luck, I expect there is just some tiny SDL or Fenix thing I have
overlooked…

I will try some of your suggestions and do some more experimenting when I have
some time (tomorrow, hopefully).

Thanks for your help,
Daniel