I’m currently trying to port some of our titles to OS X/Quartz using the
Cocoa layer (instead of the Carbon layer). As a reference point, our
games are successfully deployed on Linux using SDL, so I’m pretty
confident we’ve got most of the SDL bit down.
The first problem is the easiest one. BACKSPACE (and probably some
other keys) puts 127 into key.unicode in QZ_DoKey(). I don’t know if
this is expected behaviour, but under Linux I don’t believe it is:
there I’ve been checking key.unicode for ctrl-H to detect backspace.
This is easy enough to fix on the application end by looking at
key.sym or, if it helps with consistency, by putting key.sym into
key.unicode when key.unicode == 127 in QZ_DoKey() (or a similar hack).
The second problem is significantly gnarlier, and has to do with the
following sequence:
- start app in fullscreen
- switch to windowed
Going window -> fullscreen -> window, or just window -> fullscreen,
does not manifest this problem.
My initial take is that this is a problem with Quartz/Cocoa, not SDL,
but I’ll describe it here just so others can take a look at it.
The initial problem was that when I was back in windowed mode, my mouse
"y" value was always 0. Further research showed it was always getting
clipped. Even further investigation showed that something really,
really wacky was going on.
To do some basic sanity checks, I called [[NSScreen mainScreen] frame]
first thing in SetVideoModeWindowed(), and it reported origin = 0,0 and
size = 640,480. Except I was running 1280x1024, and everything else
checked out and looked fine. My theory: CGDisplayRelease (or a
related call) isn’t “flushing” some state that NSScreen is
referencing. It’s as if [NSScreen mainScreen] is updated on
CGDisplaySetMode, but not when the mode is unset.
So that was/is a problem.
Then, as a second sanity check, I called [qz_window frame];
origin.y was ranging from 0 to -screenheight. In other words, with
the window halfway down the screen, its origin would be 0,-768 or
something similar.
Cocoa uses an inverted coordinate system from what most people are used
to with 2D display devices. (0,0) is at the bottom left of the screen.
So what this means is that apparently qz_window thinks it’s below the
bottom of the screen. Or, another way of thinking about this, it thinks
that (0,0) is at the top left of the screen, but that +Y goes up.
As an adjunct to this, calling [qz_window convertScreenToBase] gives
results that look bizarre, but which are expected once the values
are examined.
When things are working, for example, with a screen point of 0,500 and a
window origin of 0,200, “convertScreenToBase” should report an output of
0,300 (I think).
However, in my case, when stepping through, I find that I’ll have a
screen coordinate of (0,200), and after calling [qz_window
convertScreenToBase], I’ll get something like 0,600, because
qz_window’s origin is at 0,-400 (!) even though the window is
displayed correctly on the monitor. Bizarre.
I’m hesitant to say this is an SDL implementation bug, since it works
just fine in the other cases mentioned. It definitely feels like a
problem with Quartz. Of course, I’m sure Apple will blame SDL =/
I’ve posted to mac-games and cocoa-dev at lists.apple.com, but no
answer yet. Anyway, if anyone has an idea what might be causing
this, I’d love to hear any directions to investigate.
Thanks,
Brian