SDL_OPENGLBLIT

Hi,

I attached a patch for current SDL CVS that adds some additional
functionality to the OpenGL blitting part.

Okay, I guess I should explain how to use it :wink: The flag to pass to
SDL_SetVideoMode is SDL_OPENGLBLIT. It will automagically create an
OpenGL context, so there is no need to specify both SDL_OPENGL and
SDL_OPENGLBLIT. There are only two modes for SDL_OPENGLBLIT that don’t
require conversion on the fly: the standard 16-bit 565 mode and a
32-bit RGBA mode. In fact, the framebuffer will be a software surface
with the appropriate color masks. Keep in mind that the ’framebuffer’
will only contain what you draw on it, not any OpenGL magic you do.

Now you can simply use SDL_UpdateRects to draw stuff on the screen. Keep
in mind that the changes won’t be visible until you call
SDL_GL_SwapBuffers. SDL_UpdateRects doesn’t destroy the drawing order, so
if you do some OpenGL rendering, use SDL_UpdateRects to draw a surface,
and then use OpenGL to draw something else, the surface will end up ‘in
the middle’.

BTW, the code relies on OpenGL 1.2 for GL_UNSIGNED_SHORT_5_6_5. This is
no problem on Linux, but I don’t know (or care) which version of OpenGL
comes with MSVC…

--
Daniel Vogel
Programmer
Loki Entertainment Software
-------------- next part --------------
Index: src/video/SDL_sysvideo.h
===================================================================
RCS file: /cvs/SDL/src/video/SDL_sysvideo.h,v
retrieving revision 1.6.2.18
diff -r1.6.2.18 SDL_sysvideo.h
172a173
> 	int is_32bit;
Index: src/video/SDL_video.c
===================================================================
RCS file: /cvs/SDL/src/video/SDL_video.c,v
retrieving revision 1.13.2.41
diff -r1.13.2.41 SDL_video.c
596,607c596,625
< // TODO: free Surface
< SDL_VideoSurface = SDL_CreateRGBSurface(
< flags,
< width,
< height,
< 32,
< 0x000000FF,
< 0x0000FF00,
< 0x00FF0000,
< 0xFF000000
< );
< SDL_VideoSurface->flags = flags;
---
> 	if ( bpp == 16 )
> 	{
> 		video->is_32bit = 0;
> 		// TODO: free Surface
> 		SDL_VideoSurface = SDL_CreateRGBSurface(
> 			flags,
> 			width,
> 			height,
> 			16,
> 			31 << 11,
> 			63 << 5,
> 			31,
> 			0
> 			);
> 		SDL_VideoSurface->flags = flags;
> 	} else {
> 		video->is_32bit = 1;
> 		// TODO: free Surface
> 		SDL_VideoSurface = SDL_CreateRGBSurface(
> 			flags,
> 			width,
> 			height,
> 			32,
> 			0x000000FF,
> 			0x0000FF00,
> 			0x00FF0000,
> 			0xFF000000
> 			);
> 		SDL_VideoSurface->flags = flags;
> 	}

658,659c676
<
< /* Set the surface completely opaque, by default */
---
>             /* Set the surface completely opaque & white by default */

667c684
< GL_RGBA,
---
> 	video->is_32bit ? GL_RGBA : GL_RGB,

671,672c688,689
< GL_RGBA,
< GL_UNSIGNED_BYTE,
---
> 	video->is_32bit ? GL_RGBA : GL_RGB,
> 	video->is_32bit ? GL_UNSIGNED_BYTE : GL_UNSIGNED_SHORT_5_6_5,

1063c1080
<

1072,1074c1089,1093
< GL_RGBA,
< GL_UNSIGNED_BYTE,
< this->screen->pixels + 4 * update.x + update.y * this->screen->pitch );
---
> 			this->is_32bit ? GL_RGBA : GL_RGB,
> 			this->is_32bit ? GL_UNSIGNED_BYTE : GL_UNSIGNED_SHORT_5_6_5,
> 			this->screen->pixels + 
> 				this->screen->format->BytesPerPixel * update.x + 
> 				update.y * this->screen->pitch );

1082c1101
< this->glTexCoord2f( 0.0, update.h / 256.0 );
---
> 			this->glTexCoord2f( 0.0, update.h / 256.0 );

1107c1126
< this->glPushAttrib(GL_ALL_ATTRIB_BITS); /* TODO: narrow range of what is saved */
---
> 	this->glPushAttrib( GL_ALL_ATTRIB_BITS );	/* TODO: narrow range of what is saved */

1116c1135
< this->glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
---
> 	this->glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );

1122c1141
< this->glPixelStorei( GL_UNPACK_ROW_LENGTH, this->screen->pitch / 4 );
---
> 	this->glPixelStorei( GL_UNPACK_ROW_LENGTH, this->screen->pitch / this->screen->format->BytesPerPixel );

BTW, the code relies on OpenGL 1.2 for GL_UNSIGNED_SHORT_5_6_5. This is no
problem on Linux, but I don’t know (or care) which version of OpenGL comes
with MSVC…

Why not surround it with #ifdefs, then? This Solaris box doesn’t have it,
and others might not either.

(argh, these diffs are hopeless to read. unified, please!)

< SDL_VideoSurface = SDL_CreateRGBSurface(
< flags,
< width,
< height,
< 32,
< 0x000000FF,
< 0x0000FF00,
< 0x00FF0000,
< 0xFF000000
< );

This is b0rken on big-endian machines. (I think it was in the original too.)
(Thanks to people on #sdl who helped me understand that the order is always
RGBA regardless of byte sex.) The rgb565 code seems fine.

Just thought I’d mention again that when I tested SDL_OPENGLBLIT with VC
5.0 when it was first released, it was broken there too. SDL is
supposed to be a cross-platform API; no fair throwing in things
that will be broken on all but one platform.

Wesley Poole
AKA Phoenix Kokido
Tired of hiding behind an online-only identity…
members.xoom.com/kokido
@Wes_Poole

Mattias Engdegard wrote:

[snip]

Mattias Engdegård wrote:

Why not surround it with #ifdefs then. This Solaris box doesn’t have it,
and others might not either.

Well, it is surrounded with #ifdef OPENGL :wink: Actually, I could query
the version of the OpenGL implementation and the like, and then decide
before the first use whether to set the is_32bit flag. OpenGL 1.2 is only
used in the 16bpp path. That will have to wait some days, but you are
right: it should detect when the OpenGL implementation is too old, then
barf and use the slow approach.

(argh, these diffs are hopeless to read. unified, please!)

Sorry, I had to fine-tune the diff, as I have some other changes to SDL
here, so the non-unified diff was the easiest to use. This (plus some
ifdefs) will go into CVS soon, so there is no real need to read this diff
:wink: I provided it just so people can fiddle with it if they are
interested.

< SDL_VideoSurface = SDL_CreateRGBSurface(
< flags,
< width,
< height,
< 32,
< 0x000000FF,
< 0x0000FF00,
< 0x00FF0000,
< 0xFF000000
< );

This is b0rken on big-endian machines. (I think it was in the original too.)
(Thanks to people on #sdl who helped me understand that the order is always
RGBA regardless of byte sex.) The rgb565 code seems fine.

Yes, there should be an ifdef - thanks for reminding me :-)

--
Daniel Vogel
Programmer
Loki Entertainment Software

Just thought I’d remention that when I tested SDL_OPENGLBLIT with VC
5.0 when it was first released it was broken there too, SDL is
supposed to be a cross platform API too, no fair throwing in things
that will be broken on all but one platform.

BTW, the CVS versions will always be broken on some platforms.
I try to test all releases on all platforms before I release them, but
sometimes things get through. The 1.2 stable branch will be fully tested
before being released.

Oh, and regarding the blitting on Win32, the code is specifically designed
so that it will work on all platforms in theory. In practice, I’m accepting
patches. :slight_smile:

See ya!
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Hello everybody, this is my first post here and I’m very new to SDL, but
not to OpenGL.

I’m trying to render 3D with OpenGL and 2D with SDL blits on the same
screen, but I’ve run into a problem.

I open SDL with the SDL_OPENGLBLIT flag in SDL_SetVideoMode, but after the
OpenGL rendering (which comes out fine), I’m not able to blit a bmp onto
the screen surface. This is the code:

SDL_Surface *screen;

{


screen=SDL_SetVideoMode(640, 480, 16, SDL_OPENGLBLIT);

// main loop
{ DrawGLScene();
Render2d();
… // key control

}

quit();
}

void Render2d()
{
SDL_Rect *rects;
rects->x=100;
rects->y=100;
rects->w=bmp->w;
rects->h=bmp->h;
// screen=SDL_GetVideoSurface(); I try with this line
SDL_BlitSurface(bmp,NULL,screen,rects);
SDL_UpdateRects(screen,1 , rects);
}

Excuse my bad English, I’m Italian.
See you

__________________________________________________________________
Abbonati a Tiscali!
Con Tiscali By Phone puoi anche ascoltare ed inviare email al telefono.
Chiama Tiscali By Phone all’ 892 800 http://byphone.tiscali.it

Unfortunately I can’t help you with your problem, but I can give you some
advice: don’t use SDL_OPENGLBLIT for 2D OpenGL rendering. Even if you can
fix your problem, it is not supposed to be fast.

The generally preferred way to draw 2D stuff in OpenGL is to draw textured
quads in glOrtho / gluOrtho2D mode. Suppose your screen is width x height;
then this code should allow for pixel-perfect “blitting” in OpenGL:

/* Set 2D matrices */
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrtho(0, width, 0, height, -1, 1);

glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();

/* Example of some 2D rendering:
 * draws a fullscreen textured quad */
glBindTexture(GL_TEXTURE_2D, hudTextureID);
glBegin(GL_QUADS);
glTexCoord2f(0, 1);
glVertex2i(0, height);
glTexCoord2f(0, 0);
glVertex2i(0, 0);
glTexCoord2f(1, 0);
glVertex2i(width, 0);
glTexCoord2f(1, 1);
glVertex2i(width, height);
glEnd();

/* Restore 3D matrices */
glMatrixMode(GL_PROJECTION);
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();

And an additional tip: those width and height variables don’t actually have
to match your OpenGL window’s resolution. That’s a handy way to support
multiple resolutions. For instance, you could set glOrtho to 640x480 even
though the window is 800x600; all the rendering will get enlarged
automatically by OpenGL.

Regards,

Dirk Gerrits

----- Original Message -----

From: lobo666@tiscali.it ()
To:
Sent: Thursday, 14 February, 2002 12:13
Subject: [SDL] SDL_OPENGLBLIT

[snip]

Hi!
I read the previous solution to your problem, and I gotta say that this one
is not very intelligent.
Try this one:

// render 3d …
glRasterPos2f(rx,ry); // you gotta try which rx,ry are best
glDrawPixels(draw your bitmap);

This is a much easier version, isn’t it?
St0fF 64

At 12:13 14.02.2002 +0100, you wrote:

[snip]

glDrawPixels is really slow. The best way is to set up a 2D OpenGL
projection and draw it as a GL_QUAD (or you could use a tri strip):

glViewport(xpos_2d, ypos_2d, xsize_2d, ysize_2d);
glEnable(GL_TEXTURE_2D);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, (GLdouble)xsize_2d, (GLdouble)ysize_2d, 0.0, -99999, 99999);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

glDisable(GL_DEPTH_TEST);
glDisable(GL_CULL_FACE);
glDisable(GL_SCISSOR_TEST);
glDisable(GL_LIGHTING);
glDisable(GL_FOG);

NOW: (using screen coordinates):

glBegin(GL_QUADS);
glTexCoord2f(0.0f, 1.0f);
glVertex2f(xf, yf);
glTexCoord2f(1.0f, 1.0f);
glVertex2f(xf + w, yf);
glTexCoord2f(1.0f, 0.0f);
glVertex2f(xf + w, yf + h);
glTexCoord2f(0.0f, 0.0f);
glVertex2f(xf, yf + h);
glEnd();

Robin.

----- Original Message -----

From: st0ff@gmx.net (Stefan Hubner)
To:
Sent: Sunday, February 17, 2002 6:54 PM
Subject: Re: [SDL] SDL_OPENGLBLIT

[snip]

I saw on NeHe an example where SDL_OPENGLBLIT is being used. The sample
showed drawing an OpenGL scene and then using SDL to blit on top of it.
Here’s the link:
http://nehe.gamedev.net/counter.asp?file=files/basecode/nehegl_sdl.zip

You probably don’t need to look at the code to answer the question, but
the code itself is very short anyway.

Here’s the question: is hardware acceleration being used to draw the 2D? If
so, is all drawing in the example being done in 2D? And if that is the
case, why would one bother to map textures in OpenGL just to draw 2D, when
you can create an OpenGL surface and just SDL-blit onto it?

The use of SDL_OPENGLBLIT is deprecated, for quite a few reasons.

Many of those reasons can be found at
http://twomix.devolution.com/pipermail/sdl/2001-November/039852.html
(the reply to the posting about the NeHe/SDL Basecode port announcement on
this list).

However, one thing has been left out (which I haven’t actually tested, but
I’ve heard it a time or two from reliable sources): under some (all?)
circumstances, using SDL_OPENGLBLIT can corrupt your OpenGL state (the NeHe
Basecode isn’t complex enough for this to become apparent, however).

- Alex

On Monday 11 October 2004 12:10 am, apocalypznow wrote:

[snip]

John Silicon wrote:

[snip]

Just for the sake of example: a vertex/fragment program will interfere.

Stephane