glSDL + Mac OS X

I added #define HAVE_OPENGL to the top of the glSDL demo and compiled so it would use
the OpenGL functions. When I run it, I get this error:

desired = 640 x 480
real = 640 x 480

glSDLTEST has exited due to signal 4 (SIGILL).

I know that is a pretty generic error, but are there known issues when using glSDL with Mac OS X?

Also, an unrelated question: is there a vsync function I can call? I get some tearing in my game
and I could not find a way to vsync. I use double buffering, but it is not via the
SDL_DOUBLEBUF flag. I have a surface the size of the screen that I blit to, then send it to
the screen. Is there anything wrong with this?
thanks

I added #define HAVE_OPENGL to the top of the glSDL demo and
compiled so it would use the OpenGL functions. When I run it, I get
this error:

desired = 640 x 480
real = 640 x 480

glSDLTEST has exited due to signal 4 (SIGILL).

Did you compile glSDL.c with HAVE_OPENGL as well? Just defining it in
the application .c file won’t help… Then again, that should
result in a link error, because the glSDL_*() calls won’t be found
anywhere. They’re not compiled at all without HAVE_OPENGL.
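To make this concrete, a minimal sketch of a build that passes the define to both translation units might look like the following (the file and target names here are guesses, not necessarily the demo's actual ones):

```shell
# Pass HAVE_OPENGL on the compiler command line so BOTH glSDL.c and
# the application source see it, rather than #defining it in only
# one file.
gcc -DHAVE_OPENGL -c glSDL.c $(sdl-config --cflags)
gcc -DHAVE_OPENGL -c glSDLtest.c $(sdl-config --cflags)
gcc -o glSDLtest glSDL.o glSDLtest.o $(sdl-config --libs) -framework OpenGL
```

(`-framework OpenGL` is the Mac OS X way of linking against GL; on most other platforms it would be `-lGL`.)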

I know that is a pretty generic error, but are there issues when
using glSDL with Mac OS X?

Not that I know of, but it is a proof-of-concept hack… :-)

Also unrelated question: is there a vsync function I can call.

Nope, not in SDL. There is one on a few targets, but it’s of little use
in multitasking environments, because the “random” scheduling latency
prevents you from doing your job at the right time anyway.

I get some tearing in my game and I could not find a way to vsync.
I use double buffering, but it is not via the SDL_DOUBLEBUF flag.

Then you’ll have to redesign a little. The only remotely reliable way
to get vsync is to do it through SDL_Flip(), on a “real” double
buffered display. That will (if possible) have the backend or the
driver synchronize each flip. The application will block upon trying
to lock the display surface when rendering the next frame.
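As a minimal sketch of that approach (SDL 1.2 API, error handling trimmed; this is an illustration, not code from the thread):

```c
/* Request a real double buffered display so SDL_Flip() can sync to
 * the vertical retrace when the backend/driver supports it. */
#include <SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 0,
                                           SDL_DOUBLEBUF | SDL_HWSURFACE);
    if (!screen)
        return 1;

    /* ... render the frame into 'screen' (the back buffer) ... */
    SDL_FillRect(screen, NULL, SDL_MapRGB(screen->format, 0, 0, 0));

    SDL_Flip(screen);   /* blocks/syncs on the page flip where possible */

    SDL_Quit();
    return 0;
}
```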

I have a surface the size of the screen that I blit to, then send
it to the screen. Is there anything wrong with this?

No, but unless you’re doing alpha blending and/or “direct” software
rendering (ie pixel access), it eliminates pretty much all chances of
SDL accelerating your rendering.

If you’re using glSDL, you should never do that, because it
essentially reduces glSDL to a method of blitting a software surface
to the screen.

If you are doing alpha and stuff, you should consider something like
the “SemiTriple” buffering I’m using in Kobo Deluxe. Just set up an
SDL_DOUBLEBUF | SDL_HWSURFACE display, and if you actually get one
(do check! - this is entirely pointless if you can’t get a h/w
display surface anyway), set up your own shadow s/w surface that you
render into. That way, you get essentially what SDL gives you if you
ask for a double buffered s/w surface, except that it can still vsync
if the driver supports it.
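A rough sketch of that “SemiTriple” setup, as I read the description above (SDL 1.2; variable names like `shadow` are this example’s own, and this is not code lifted from Kobo Deluxe):

```c
#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Surface *screen, *shadow = NULL;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    /* Ask for a hardware double buffered display... */
    screen = SDL_SetVideoMode(640, 480, 0,
                              SDL_FULLSCREEN | SDL_DOUBLEBUF | SDL_HWSURFACE);
    if (!screen) {
        SDL_Quit();
        return 1;
    }

    /* ...and only add the shadow surface if we actually got one. */
    if ((screen->flags & SDL_HWSURFACE) && (screen->flags & SDL_DOUBLEBUF))
        shadow = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                      screen->w, screen->h,
                                      screen->format->BitsPerPixel,
                                      screen->format->Rmask,
                                      screen->format->Gmask,
                                      screen->format->Bmask,
                                      screen->format->Amask);

    SDL_Surface *target = shadow ? shadow : screen;

    /* Per frame: render (alpha blending, pixel access, ...) into
     * the software target... */
    SDL_FillRect(target, NULL, SDL_MapRGB(target->format, 0, 0, 64));

    /* ...then push it to the screen. SDL_Flip() vsyncs if the
     * driver supports it. */
    if (shadow)
        SDL_BlitSurface(shadow, NULL, screen, NULL);
    SDL_Flip(screen);

    if (shadow)
        SDL_FreeSurface(shadow);
    SDL_Quit();
    return 0;
}
```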

//David Olofson - Programmer, Composer, Open Source Advocate

.- The Return of Audiality! --------------------------------.
| Free/Open Source Audio Engine for use in Games or Studio. |
| RT and off-line synth. Scripting. Sample accurate timing. |
`-----------------------------------> http://audiality.org -'
http://olofson.net | http://www.reologica.se

On Friday 06 June 2003 10.12, Jeffrey Schnayer wrote:


I added #define HAVE_OPENGL to the top of the glSDL demo and compiled
so it would use the OpenGL functions. When I run it, I get this error:

desired = 640 x 480
real = 640 x 480

glSDLTEST has exited due to signal 4 (SIGILL).

I know that is a pretty generic error, but are there issues when using
glSDL with Mac OS X?

Run it under gdb and it will show you where you are crashing. Hopefully
it’s something obvious that’s causing the error (like a null pointer).
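For example (assuming the binary is named glSDLTEST and was built with `-g` so the backtrace has useful symbols):

```shell
gdb ./glSDLTEST
# at the (gdb) prompt:
#   run       <- reproduce the SIGILL
#   bt        <- backtrace: shows the function (and line, with -g)
#                where the illegal instruction was hit
```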

Also, an unrelated question: is there a vsync function I can call? I get
some tearing in my game and I could not find a way to vsync. I use
double buffering, but it is not via the SDL_DOUBLEBUF flag. I have a
surface the size of the screen that I blit to, then send it to the
screen. Is there anything wrong with this?
thanks

SDL CVS has experimental double-buffer emulation if you use
SDL_FULLSCREEN|SDL_DOUBLEBUF|SDL_HWSURFACE. It uses a real-time thread
to do its work, and has fairly good success from what I’ve seen so far.
In my testing, there was one case where tearing was visible near the
top of the screen at high resolutions, so your mileage may vary.

If you want vsync with OpenGL on Mac OS X, the driver/card can be
configured to do this. I don’t recall the exact code (check the
archives, google, etc), but it was something like:

int swapInterval = 1; // use 0 for no vbl sync
CGLSetParameter(CGLGetCurrentContext(), kCGLCPSwapInterval,
&swapInterval);

Now SDL_GL_SwapBuffers() will vsync.

HTH,
Darrell

On Friday, June 6, 2003, at 03:01 PM, sdl-request at libsdl.org wrote:

Date: Fri, 6 Jun 2003 01:12:19 -0700 (PDT)
To: sdl at libsdl.org
From: “Jeffrey Schnayer”
Subject: [SDL] glSDL + mac os x
Reply-To: sdl at libsdl.org

If you want vsync with OpenGL on Mac OS X, the driver/card can be
configured to do this. I don’t recall the exact code (check the
archives, google, etc), but it was something like:

int swapInterval = 1; // use 0 for no vbl sync
CGLSetParameter(CGLGetCurrentContext(), kCGLCPSwapInterval,
&swapInterval);

Now SDL_GL_SwapBuffers() will vsync.

We should really wrap this in a SDL_GL_SetAttribute() call.

–ryan.

If you want vsync with OpenGL on Mac OS X, the driver/card can be
configured to do this. I don’t recall the exact code (check the
archives, google, etc), but it was something like:

int swapInterval = 1; // use 0 for no vbl sync
CGLSetParameter(CGLGetCurrentContext(), kCGLCPSwapInterval,
&swapInterval);

Now SDL_GL_SwapBuffers() will vsync.

We should really wrap this in a SDL_GL_SetAttribute() call.
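For reference, a wrapper along these lines did eventually land in SDL 1.2 as the SDL_GL_SWAP_CONTROL attribute (added well after this thread, so treat its availability as version-dependent). A sketch of how it is used:

```c
#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Must be set before SDL_SetVideoMode(); 1 = sync buffer swaps
     * to the vertical retrace, 0 = swap immediately. */
    SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1);

    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 0, SDL_OPENGL);
    if (screen) {
        /* ... GL rendering ... */
        SDL_GL_SwapBuffers();   /* now waits for the retrace */
    }
    SDL_Quit();
    return 0;
}
```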

Patch? :-)

See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment