Migration guide errors / comments

Congrats on the release of 2.0!

Anyway… a few comments / issues.

One, let’s say we have this from our old code:

Code:
screen = SDL_SetVideoMode(width, height, bpp, flags);

It is suggested that the new way is to do:

Code:
screen = SDL_CreateWindow("My Game Window", SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, width, height, flags);

This is not really the same, since in the old API call, we specified a bpp (bit depth), and the new way doesn’t have that option at all.

We don’t call SDL_ListModes() anymore. There’s an equivalent function in SDL2, but instead we’re going to use a new feature called “fullscreen desktop” mode.

You should mention what SDL_ListModes() maps to, since people still need to know what resolutions / bit depths are available to the user, so we can populate a menu with the correct choices. I don’t see an easy way to tell what bit depth is available now, besides checking current.format from a call to SDL_GetCurrentDisplayMode(0, &current); but current.format is much more fine-grained than just a bit depth.
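For what it’s worth, the closest I’ve found is to collapse current.format back down with the SDL_BITSPERPIXEL() macro from SDL_pixels.h. An untested sketch of mine:

Code:
SDL_DisplayMode current;
if (SDL_GetCurrentDisplayMode(0, &current) == 0) {
    // reduce the fine-grained format enum to a plain bit depth, e.g. 32
    int bpp = SDL_BITSPERPIXEL(current.format);
    printf("desktop depth: %d bpp\n", bpp);
}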

The good news: the 1.2 SDL_Surface API mostly still exists. So change your screen surface from this:

Code:
SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32, 0);

…to this…

Code:
// if all this hex scares you, check out SDL_PixelFormatEnumToMasks()!
SDL_Surface *screen = SDL_CreateSurface(0, 640, 480, 32,
                                        0x00FF0000,
                                        0x0000FF00,
                                        0x000000FF,
                                        0xFF000000);

I assume you meant to type SDL_CreateRGBSurface(); the one you used is from SDL 1.2.

On the input stuff, I’m not sure about these: SDLK_RMETA, SDLK_RSUPER, SDLK_LMETA, SDLK_LSUPER… I assume SDLK_R/LMETA maps to SDLK_R/LGUI, but what does SDLK_RSUPER map to?

And finally (well, not really, since I got more debugging to do on the conversion), you have:

Now there’s SDL_ShowSimpleMessageBox(). You’re welcome!

Yes, finally! WooHoo!
Small problem: it isn’t listed anywhere except the source.
It’s not on http://wiki.libsdl.org/moin.fcg/CategoryAPI at all?
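From skimming the source, the call itself looks simple enough; this is just my reading of the header, so take it as a sketch:

Code:
// SDL_MESSAGEBOX_INFORMATION and SDL_MESSAGEBOX_WARNING also exist
SDL_ShowSimpleMessageBox(SDL_MESSAGEBOX_ERROR,
                         "My Game",
                         "Something went wrong!",
                         NULL);  // NULL means no parent window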

Now, back to converting my app to make it work on SDL 2.0…

On Wednesday, 14 August 2013 at 01:45, Sparks wrote:

Quote:
Now there’s SDL_ShowSimpleMessageBox(). You’re welcome!

Yes, finally! WooHoo!
Small problem: it isn’t listed anywhere except the source.
It’s not on http://wiki.libsdl.org/moin.fcg/CategoryAPI at all?

Problems also happen the other way round: the wiki lists a few functions that don’t exist at all:

http://wiki.libsdl.org/moin.fcg/SDL_GetNumKeyboards
(#119 on this page http://wiki.libsdl.org/moin.fcg/CategoryAPI)

#15 JoystickOpened on this page
http://wiki.libsdl.org/moin.fcg/CategoryJoystick

There may be a few others that I didn’t write down (I’m writing bindings, which is why I notice these things).

Despite Ryan’s argued response in the other thread, I don’t think this two-headed documentation structure is really tenable.

I have no problem with the idea that more general documentation (introduction, tutorials, etc.) lives in a user-contributed wiki. I still think that reference documentation should be provided by the devs themselves, generated from the headers, as they are the best placed to accurately describe the semantics of functions and their platform limitations.

Third-party contributions to this reference documentation could also be made easier by moving to a more contributor-friendly, pull-request-style workflow (e.g. Bitbucket, which supports hg, or git alternatives like GitLab or GitHub, or others that I don’t know of).

Best,

Daniel

This was great feedback, thanks!

This is not really the same, since in the old API call, we specified a
bpp (bit depth), and the new way doesn’t have that option at all.

The way that guide explains it gives you the default bit depth of the
desktop (which is probably always 32 on modern machines) and lets the
render API work out the differences. It’s better to not change the depth
of the desktop, honestly, and let the GPU convert your game’s output to
what the user is already using.

The way you can choose a display bit depth is more complicated in
SDL2, as opposed to the 1.2 SDL_SetVideoMode() call. You can try
SDL_GetClosestDisplayMode(), and if it finds something and you like what
it finds, you can use SDL_SetWindowDisplayMode(), but this only changes
the bit depth when your window is fullscreen.
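Assuming window is the SDL_Window you created, the flow is roughly this (an untested sketch, error handling omitted):

Code:
SDL_DisplayMode wanted, closest;
SDL_zero(wanted);
wanted.w = 640;
wanted.h = 480;
wanted.format = SDL_PIXELFORMAT_RGB565;  // ask for a 16bpp mode

if (SDL_GetClosestDisplayMode(0, &wanted, &closest) != NULL) {
    SDL_SetWindowDisplayMode(window, &closest);
    // the requested mode only takes effect while fullscreen
    SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN);
}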

Trust me, you don’t want to change the bit depth anymore. It was really
important for software renderers, both to get a speedup at lower bit
depths and just to make sure the display knew what the framebuffer
memory layout was, but the GPU changes the game entirely, and SDL2 wants
you to take advantage of that.
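In other words: pick whatever format you want for your own pixels and let the renderer deal with the desktop. A sketch, again assuming window already exists:

Code:
SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, 0);
SDL_Texture *texture = SDL_CreateTexture(renderer,
                                         SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING,
                                         640, 480);
// write ARGB8888 pixels into the texture; the GPU converts them
// to whatever the display is actually using when you present.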

You should mention what SDL_ListModes() maps to, since people still
need to know what resolutions / bit depths are available to the user,
so we can populate a menu with the correct choices. I don’t see an
easy way to tell what bit depth is available now, besides checking
current.format from a call to SDL_GetCurrentDisplayMode(0, &current);
but current.format is much more fine-grained than just a bit depth.

The replacement for SDL_ListModes() is to use SDL_GetNumDisplayModes(),
and then enumerate with SDL_GetDisplayMode(). I’ll link to them in the
wiki page. We sort of cheat in the guide by promoting fullscreen-desktop
mode here.
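For reference, the enumeration ends up looking something like this (sketch):

Code:
int i, num = SDL_GetNumDisplayModes(0);  // display index 0
for (i = 0; i < num; i++) {
    SDL_DisplayMode mode;
    if (SDL_GetDisplayMode(0, i, &mode) == 0) {
        printf("%dx%d @ %d bpp\n", mode.w, mode.h,
               SDL_BITSPERPIXEL(mode.format));
    }
}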

I assume you meant to type SDL_CreateRGBSurface(); the one you used is
from SDL 1.2.

Whoops, yes, that should be SDL_CreateRGBSurface()… fixed now.
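For the record, the corrected example reads:

Code:
// if all this hex scares you, check out SDL_PixelFormatEnumToMasks()!
SDL_Surface *screen = SDL_CreateRGBSurface(0, 640, 480, 32,
                                           0x00FF0000,
                                           0x0000FF00,
                                           0x000000FF,
                                           0xFF000000);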

On the input stuff, I’m not sure about these: SDLK_RMETA, SDLK_RSUPER,
SDLK_LMETA, SDLK_LSUPER… I assume SDLK_R/LMETA maps to SDLK_R/LGUI,
but what does SDLK_RSUPER map to?

(I could be wrong about this, but…) this is all SDLK_L/RGUI now. Some
targets called this key SDLK_L/RSUPER in 1.2; everyone else called it
SDLK_L/RMETA (although X11 claimed it could report both, it never did
as far as I know).
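So for porting purposes, a compatibility shim along these lines should cover it (my own assumption, mapping all four old names onto the new ones):

Code:
// porting shim: fold the 1.2 META/SUPER names into the SDL2 GUI keys
#define SDLK_LMETA  SDLK_LGUI
#define SDLK_RMETA  SDLK_RGUI
#define SDLK_LSUPER SDLK_LGUI
#define SDLK_RSUPER SDLK_RGUI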

Yes, finally! WooHoo!
Small problem: it isn’t listed anywhere except the source.
It’s not on http://wiki.libsdl.org/moin.fcg/CategoryAPI at all?

Yeah, it’s an oversight. I’ll fill that in when I get a chance.

Now, back to converting my app to make it work on SDL 2.0…

Good luck, and thanks for reading the guide!

–ryan.

On 2013/8/15, Ryan C. Gordon wrote:

It’s better to not change the depth
of the desktop, honestly, and let the GPU convert your game’s output to
what the user is already using.

Not just that, on X11 you outright can’t change the color depth, not
even with stuff like XRandR. The only way to do it involves restarting
the server (which in turn means shutting down any programs running on
it, i.e. pretty much every program with any sort of window).

Trust me, you don’t want to change the bit depth anymore. It was really
important for software renderers, both to get a speedup at lower bit
depths and just to make sure the display knew what the framebuffer
memory layout was, but the GPU changes the game entirely, and SDL2 wants
you to take advantage of that.

Actually, this may become a thing if we end up with 64-bit color
depths in consumer systems in the future. Of course, nothing prevents
the GPU from rendering to a 32-bit buffer then showing it in a 64-bit
mode (which would be pretty trivial for the hardware probably), but
even in that case there’s no reason for SDL not to make it easy to
perform that task.

We’re still not there though, so I guess there isn’t much of a point
right now…