Migrating from SDL 1.2 to SDL2: some problems

Hello,

I’m trying to migrate some software from SDL 1.2 to SDL2. I had a look at the migration guide, but in some cases I wasn’t able to substitute the old functions with the new ones offered by SDL2.

I kindly ask for the community’s support in solving some problems I’m encountering while porting some programs.

Just to start, I have the following issue.

This line is OK for SDL1, but for SDL2 I get: "dereferencing pointer to incomplete type struct"

Code:

fullscreen = ((screen->flags & SDL_FULLSCREEN) == SDL_FULLSCREEN);

Any hint?

This is another point where I’m stuck at the moment.
What if the previous SDL1 code uses SDL_DOUBLEBUF or, as in this case, SDL_HWPALETTE?

Code:

if (bitdepth == 8)
    uiSDLVidModFlags |= SDL_HWPALETTE;
if (fullscreen) {
    uiSDLVidModFlags |= SDL_FULLSCREEN | SDL_HWSURFACE;
    if (!screen_is_picasso && currprefs.gfx_vsync)
        uiSDLVidModFlags |= SDL_DOUBLEBUF;
}

Thank you very much.
Kind regards


What is the type of your screen value? If it’s an SDL_Window,
SDL_GetWindowFlags (https://wiki.libsdl.org/SDL_GetWindowFlags) will give
you the flags for checking whether or not you are in fullscreen.
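
For example, a minimal sketch of the SDL2 pattern (the window title and size here are made up for illustration):

Code:

#include <SDL.h>

/* In SDL2, "screen" is an opaque SDL_Window handle, not an SDL_Surface. */
SDL_Window *screen = SDL_CreateWindow("my app",
                                      SDL_WINDOWPOS_UNDEFINED,
                                      SDL_WINDOWPOS_UNDEFINED,
                                      640, 480,
                                      SDL_WINDOW_FULLSCREEN);

/* Query the flags instead of dereferencing the struct. */
int fullscreen = (SDL_GetWindowFlags(screen) & SDL_WINDOW_FULLSCREEN) != 0;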

In terms of double buffering, creating your renderer with the present-vsync
flag (https://wiki.libsdl.org/SDL_RendererFlags) is probably what you want.
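
Something like this, assuming a window created as in the sketch above (not a drop-in replacement):

Code:

/* Ask for an accelerated renderer whose presents wait for vsync. */
SDL_Renderer *renderer = SDL_CreateRenderer(screen, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);

/* ... draw with SDL_RenderCopy() etc. ... */
SDL_RenderPresent(renderer);  /* swap, synchronized to the display refresh */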

I haven’t had to port any code using SDL_HWPALETTE from SDL 1.x, so I’m not
sure if there is a translation or not; someone else will have to pipe in on
that.


fullscreen = ((screen->flags & SDL_FULLSCREEN) == SDL_FULLSCREEN);

You want this:

fullscreen = (SDL_GetWindowFlags(screen) & SDL_WINDOW_FULLSCREEN) != 0;

(Note that this will work whether you used SDL_WINDOW_FULLSCREEN or
SDL_WINDOW_FULLSCREEN_DESKTOP, since they both set the former’s bit.)
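
You can see this in SDL2’s SDL_video.h, where the desktop variant is defined on top of the plain fullscreen bit:

Code:

SDL_WINDOW_FULLSCREEN = 0x00000001,         /* fullscreen window */
SDL_WINDOW_FULLSCREEN_DESKTOP = ( SDL_WINDOW_FULLSCREEN | 0x00001000 ),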

This is another point where I’m stuck at the moment.
What if the previous SDL1 code uses SDL_DOUBLEBUF or, as in this case,
SDL_HWPALETTE?

SDL_HWPALETTE doesn’t mean much in modern times; you can just drop it.
Presumably you have a 32-bit display, so if you give SDL an 8-bit
surface, it’ll just convert it to display format as appropriate and not
force a hardware palette that would mess up other windows on the screen.
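
If your drawing code still produces an 8-bit indexed surface, one way to handle it (a sketch; "surf8" is a hypothetical 8-bit surface with its palette already set) is to convert it to a 32-bit format before putting it on the window:

Code:

/* "surf8" is a hypothetical 8-bit indexed surface with its palette set. */
SDL_Surface *converted =
    SDL_ConvertSurfaceFormat(surf8, SDL_PIXELFORMAT_ARGB8888, 0);
if (converted != NULL) {
    SDL_BlitSurface(converted, NULL, SDL_GetWindowSurface(screen), NULL);
    SDL_UpdateWindowSurface(screen);
    SDL_FreeSurface(converted);
}

(SDL_BlitSurface converts between formats on the fly anyway, so the explicit conversion is mainly worth it if you blit the same surface many times.)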

SDL_DOUBLEBUF doesn’t mean much now either (you’re probably
double-buffered everywhere now), but if you’re using the SDL2 Render API
to get your surface to the screen, you can create the renderer with the
SDL_RENDERER_PRESENTVSYNC flag to make sure there’s no tearing. If you’re
eventually drawing with OpenGL, you can use SDL_GL_SetSwapInterval(1)
after creating your GL context to the same effect.
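
For the OpenGL path, that looks something like this (a sketch; the window name and size are illustrative):

Code:

SDL_Window *win = SDL_CreateWindow("gl app",
                                   SDL_WINDOWPOS_UNDEFINED,
                                   SDL_WINDOWPOS_UNDEFINED,
                                   640, 480, SDL_WINDOW_OPENGL);
SDL_GLContext ctx = SDL_GL_CreateContext(win);
SDL_GL_SetSwapInterval(1);   /* 1 = wait for vsync on each buffer swap */

/* ... GL drawing ... */
SDL_GL_SwapWindow(win);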

–ryan.
