Does OpenGL care about the color depth of surfaces? (plus more questions)

Hi,

First, I'd like to get the simple thing out of the way. When I was using SDL for
drawing, I would convert each surface to the pixel format of the screen right
away, so SDL wouldn't have to do it on the fly… Now that I'm using OpenGL, do I
still have to convert my images to the screen format before loading them into an
OpenGL texture, or does it not matter at this point?

Second, I've recently had a problem with SDL and OpenGL. It's a real pain
to make sure each SDL_Surface is in the proper format before loading it as an
OpenGL texture. I finally found a system that works on any image,
but there's one thing that's bothering me. This is my routine for loading an
image to be turned into a texture:

1. Load the image via IMG_Load().
2. Convert that surface to a new one via SDL_DisplayFormatAlpha().
3. Call SDL_SetAlpha(img, 0, 0), where img is the surface so far.
4. Create ANOTHER surface via SDL_CreateRGBSurface().
5. Blit the loaded image onto this newly created surface.

Now my surface is in RGB order and still has the alpha channel! Awesome!
(It took me a long time to figure this all out =))
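
For reference, the whole routine might look roughly like this sketch (assuming
SDL 1.2 + SDL_image; the function name, the little-endian RGBA masks, and the
filter settings are illustrative choices, and error checking is omitted):

    #include <SDL/SDL.h>
    #include <SDL/SDL_image.h>
    #include <SDL/SDL_opengl.h>

    GLuint load_texture(const char *filename)
    {
        SDL_Surface *raw = IMG_Load(filename);            // 1. load the image
        SDL_Surface *img = SDL_DisplayFormatAlpha(raw);   // 2. convert, keeping alpha
        SDL_SetAlpha(img, 0, 0);   // 3. disable blending so the blit copies alpha as-is

        // 4. create a surface whose byte order we know (RGBA on little-endian)
        SDL_Surface *rgba = SDL_CreateRGBSurface(SDL_SWSURFACE, img->w, img->h, 32,
                                                 0x000000ff, 0x0000ff00,
                                                 0x00ff0000, 0xff000000);
        SDL_BlitSurface(img, NULL, rgba, NULL);           // 5. blit into the known format

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, rgba->w, rgba->h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, rgba->pixels);

        SDL_FreeSurface(raw);
        SDL_FreeSurface(img);
        SDL_FreeSurface(rgba);
        return tex;
    }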

This works great! There is just one problem: in the call to SDL_CreateRGBSurface(),
I MUST specify 32 for the bpp parameter, or else it crashes! If I try passing
16, it crashes. Does anyone know why? Thanks for any help!


It might have something to do with SDL not knowing how to cram an alpha
channel into a 16 bpp surface. It's easy for 32 bpp because there are 8 extra
bits lying around, but 16 bpp is usually a 5:6:5 RGB-only format
(or 5:5:5 with a single alpha mask bit), unless your video card supports
something like 4:4:4:4, which leaves 4 bits for alpha. I don't know how much
support SDL has for these formats, though.
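
For reference, the common 16-bit channel layouts look roughly like this (a
sketch; the exact mask values are the usual conventions, not something taken
from the posts above):

    // RGB 5:6:5 - no bits left over for alpha
    Uint32 r565 = 0xF800, g565 = 0x07E0, b565 = 0x001F, a565 = 0x0000;

    // ARGB 1:5:5:5 - a single alpha "mask" bit
    Uint32 a1555 = 0x8000, r1555 = 0x7C00, g1555 = 0x03E0, b1555 = 0x001F;

    // RGBA 4:4:4:4 - four bits per channel, including alpha
    Uint32 r4444 = 0xF000, g4444 = 0x0F00, b4444 = 0x00F0, a4444 = 0x000F;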


Thanks, but here's the weird thing. If I set up the screen, loaded the textures,
and created the surface with SDL_CreateRGBSurface(), passing the bpp parameter as

SDL_GetVideoInfo()->vfmt->BitsPerPixel

it worked, even when I set my screen bpp to 16. But if I manually put 16 in there,
it crashes! Even weirder, if I do

const SDL_VideoInfo *vi = SDL_GetVideoInfo();

cout << vi->vfmt->BitsPerPixel << endl;

it prints a blank character to the screen! So weird… I tried this cout in
many different spots and got the same blank space, yet it sets up the screen
perfectly? It doesn't make sense… and BytesPerPixel prints out a strange
character… Any ideas? Thanks for any more help!


Quoth Graveyard Filla, on 2004-07-13 02:41:58 +0000:

Even weirder, if I do

const SDL_VideoInfo *vi = SDL_GetVideoInfo();

cout << vi->vfmt->BitsPerPixel << endl;

it prints a blank character to the screen! So weird…

According to the man page for SDL_PixelFormat, the BitsPerPixel element
is a Uint8 – an unsigned 8-bit integer. This is usually equivalent to
an unsigned char, and hence the << operator overload is interpreting it
as a character rather than an integer. ASCII character 32, of course,
is space. Adding an explicit cast to int or similar should prevent
that.
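
In code, that fix might look like this (a sketch; note that SDL_GetVideoInfo()
returns a const pointer in SDL 1.2):

    const SDL_VideoInfo *vi = SDL_GetVideoInfo();
    // Cast the Uint8 fields to int so operator<< prints numbers, not characters.
    cout << static_cast<int>(vi->vfmt->BitsPerPixel) << endl;   // e.g. 32
    cout << static_cast<int>(vi->vfmt->BytesPerPixel) << endl;  // e.g. 4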

—> Drake Wilson

It's probably printing a blank character because the space character is 32 in
ASCII, which would correspond to 32 bpp in your case. The reason your cout is
printing ASCII characters rather than numbers is that BitsPerPixel is probably
an unsigned char (or char); just cast it to an int.

Have you checked the bpp and surface format flags that SDL_GetVideoInfo() is
returning?
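
For example, something like this (a sketch, assuming the usual iostream setup
with using namespace std) would dump the reported depth and channel masks:

    const SDL_VideoInfo *vi = SDL_GetVideoInfo();
    cout << "bpp:   " << (int)vi->vfmt->BitsPerPixel << endl;
    cout << hex
         << "Rmask: " << vi->vfmt->Rmask << endl
         << "Gmask: " << vi->vfmt->Gmask << endl
         << "Bmask: " << vi->vfmt->Bmask << endl
         << "Amask: " << vi->vfmt->Amask << endl
         << dec;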


Quoth Drew Ferraro, on 2004-07-13 07:07:08 +0000:

First, I want to apologize that this isn't a direct reply to my post. For some
reason, when I click "follow up" and try to post the message, it gives an error:
"you are top posting, dont do that". I tried for about 5 minutes to figure out
how to reply, but it just wouldn't let me…

If you are in fact placing your text above the quoted text, this is
often considered poor style; see
http://catb.org/jargon/html/email-style.html. It breaks up the flow
of discussion, since replies and original messages no longer read in a
natural order and must be mentally rearranged by the reader.

Thanks guys, but does anyone have any clue why my program crashes when I call

SDL_Surface *img2 = SDL_CreateRGBSurface(SDL_SWSURFACE, img->w, img->h,
                                         video_info->vfmt->BitsPerPixel,
                                         0x000000ff, 0x0000ff00,
                                         0x00ff0000, 0xff000000);

If video_info->vfmt->BitsPerPixel is 16, then you're supplying masks
that contradict the given bpp value: 0xff000000 will not fit into a
16-bit pixel. I imagine it may be possible to use

SDL_CreateRGBSurface(SDL_SWSURFACE, width, height,
                     16, 0x000f, 0x00f0, 0x0f00, 0xf000)

instead, but I haven't tested this; it would correspond to the type
GL_UNSIGNED_SHORT_4_4_4_4 or GL_UNSIGNED_SHORT_4_4_4_4_REV in the call
to glTexImage2D. (Note that these types are only available in OpenGL
versions >= 1.2, according to the relevant man page.)
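
Putting that together, the 16-bit path might look something like this (an
untested sketch; img stands for the already-converted surface from earlier in
the thread, and pitch/row-alignment details are ignored). These low-nibble-red
masks line up with GL_UNSIGNED_SHORT_4_4_4_4_REV when the pixel format argument
is GL_RGBA; with plain GL_UNSIGNED_SHORT_4_4_4_4 the masks would be reversed
(0xf000, 0x0f00, 0x00f0, 0x000f):

    SDL_Surface *img16 = SDL_CreateRGBSurface(SDL_SWSURFACE, img->w, img->h, 16,
                                              0x000f, 0x00f0, 0x0f00, 0xf000);
    SDL_BlitSurface(img, NULL, img16, NULL);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, img16->w, img16->h, 0,
                 GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4_REV, img16->pixels);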

Secondly, OpenGL will, if I'm not mistaken, convert the data as necessary when
uploading and/or using textures. glTexImage2D takes the existing format
of the pixel data as part of its parameter list, along with a suggested
internal format; the latter may be overridden or adjusted as necessary by
the implementation. Creating a texture from 32-bit data when the framebuffer
is only 16 bits deep should be no problem; I do it regularly and have had no
trouble with it, though I can't vouch for other platforms.
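
Concretely, with the parameters annotated (a sketch; rgba stands for a 32-bit
RGBA surface like the one built earlier in the thread):

    glTexImage2D(GL_TEXTURE_2D,
                 0,                 // mipmap level
                 GL_RGBA,           // *requested* internal format; the driver may narrow it
                 rgba->w, rgba->h,
                 0,                 // border
                 GL_RGBA,           // format of the data being supplied
                 GL_UNSIGNED_BYTE,  // 8 bits per channel, i.e. 32-bit source pixels
                 rgba->pixels);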

—> Drake Wilson

Thanks, Drake. I'll just put 32 in there, then. Thanks again!