SDL_ttf font -> gl texture

Hi all,

Sorry for the newbie question, but I was hoping someone could show me how
to create OpenGL textures from the SDL_Surface returned by TTF_RenderText…

I tried to do it with:
glTexImage2D(GL_TEXTURE_2D, 0, 3, text->w, text->h, 0, GL_RGB,
GL_UNSIGNED_BYTE, text->pixels);

but all I get is garbage. I know GL textures have to be powers of 2 in
width/height and the SDL_Surface (text) isn’t, but I’m clueless as to what
I should do.

Does anyone know what I should be doing?

Thanks,
dave

The problem is that the pixel format of the surface created by SDL is not
necessarily RGB in that order…

You must check and, if necessary, convert the SDL surface to the correct
pixel format before passing it to the GL texturing engine.

> ----- Original Message -----

From: sdl-admin at libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
david morrison
Sent: Sunday, 28 October 2001 11:39
To: sdl at libsdl.org
Subject: [SDL] SDL_ttf font -> gl texture

Hi all,

Sorry for the newbie question, but I was hoping someone could show me how
to create OpenGL textures from the SDL_Surface returned by TTF_RenderText…

I tried to do it with:
glTexImage2D(GL_TEXTURE_2D, 0, 3, text->w, text->h, 0, GL_RGB,
GL_UNSIGNED_BYTE, text->pixels);

but all I get is garbage. I know GL textures have to be powers of 2 in
width/height and the SDL_Surface (text) isn't, but I'm clueless as to what
I should do.

Does anyone know what I should be doing?

Thanks,
dave



There are almost no image formats that OpenGL 1.2 doesn't support, though,
so it doesn't make much sense to convert everything to one particular
format. You can get OpenGL 1.2 headers for win32 easily enough, and if you
have Q3A you already have OpenGL 1.2 drivers.

My suggestion is to handle every format you expect to see yourself, using
the correct OpenGL texture formats. Convert anything else, for no other
reason than that some drivers are broken when fed uncommon formats.

On Sun, Oct 28, 2001 at 08:10:08PM +0100, Damien Mascre wrote:

The problem is that the pixel format of the surface created by SDL is not
necessarily RGB in that order…

You must check and, if necessary, convert the SDL surface to the correct
pixel format before passing it to the GL texturing engine.
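
Concretely, "check the pixel format" just means looking at the surface's
format fields. Something like this (a quick sketch; "text" stands for the
surface returned by TTF_RenderText*):

printf("bpp=%d  Rmask=0x%08x  Gmask=0x%08x  Bmask=0x%08x  Amask=0x%08x\n",
       text->format->BitsPerPixel,
       (unsigned)text->format->Rmask, (unsigned)text->format->Gmask,
       (unsigned)text->format->Bmask, (unsigned)text->format->Amask);

Those masks tell you which GL format/type pair (if any) matches the data as-is.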


Joseph Carter Free software developer

There Is No Cabal.


The easiest way to do this would be to software-blit the image to a
surface of appropriate size and bits per pixel and then pass that to
OpenGL.

/* Round the surface's dimensions up to the next power of two. */
int n_w = 1;
int n_h = 1;

while (n_w < surfacename->w) n_w <<= 1;
while (n_h < surfacename->h) n_h <<= 1;

/* Masks for an RGBA byte order that matches GL_RGBA / GL_UNSIGNED_BYTE. */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
Uint32 Rmask = 0xFF000000, Gmask = 0x00FF0000, Bmask = 0x0000FF00, Amask = 0x000000FF;
#else
Uint32 Rmask = 0x000000FF, Gmask = 0x0000FF00, Bmask = 0x00FF0000, Amask = 0xFF000000;
#endif

SDL_Surface* newimage = SDL_CreateRGBSurface(SDL_SWSURFACE, n_w, n_h, 32,
                                             Rmask, Gmask, Bmask, Amask);

/* Turn off SDL_SRCALPHA on the source so the blit copies the alpha channel
   instead of blending with it. */
SDL_SetAlpha(surfacename, 0, 0);
SDL_BlitSurface(surfacename, NULL, newimage, NULL);

GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, newimage->w, newimage->h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, newimage->pixels);

/* The text only occupies the top-left surfacename->w x surfacename->h of
   the texture, so scale your texture coordinates by w/n_w and h/n_h when
   drawing. GL has copied the pixels, so the surface can be freed now. */
SDL_FreeSurface(newimage);

> ----- Original Message -----

From: sdl-admin@libsdl.org [mailto:sdl-admin at libsdl.org] On Behalf Of
david morrison
Sent: Sunday, October 28, 2001 2:39 AM
To: sdl at libsdl.org
Subject: [SDL] SDL_ttf font -> gl texture

Hi all,

Sorry for the newbie question, but I was hoping someone could show me how
to create OpenGL textures from the SDL_Surface returned by TTF_RenderText…

I tried to do it with:
glTexImage2D(GL_TEXTURE_2D, 0, 3, text->w, text->h, 0, GL_RGB,
GL_UNSIGNED_BYTE, text->pixels);

but all I get is garbage. I know GL textures have to be powers of 2 in
width/height and the SDL_Surface (text) isn't, but I'm clueless as to what
I should do.

Does anyone know what I should be doing?

Thanks,
dave



The problem is that the pixel format of the surface created by SDL is not
necessarily RGB in that order…

You must check and, if necessary, convert the SDL surface to the correct
pixel format before passing it to the GL texturing engine.

There are almost no image formats that OpenGL 1.2 doesn't support, though,
so it doesn't make much sense to convert everything to one particular
format. You can get OpenGL 1.2 headers for win32 easily enough, and if you
have Q3A you already have OpenGL 1.2 drivers.

My suggestion is to handle every format you expect to see yourself, using
the correct OpenGL texture formats. Convert anything else, for no other
reason than that some drivers are broken when fed uncommon formats.

However, SDL_ttf returns ARGB pixels (WHY???). OGL 1.2 does not
support this format. So the SDL_Surface returned by TTF_RenderText*
must be converted, as well as SDL_BlitSurface()'d onto another surface
(granted you can combine these two steps) that has dimensions which are
powers of two. Then a GL texture can be created and used.
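
Combining the two steps just means making the destination surface both
RGBA-ordered and power-of-two sized, and letting SDL_BlitSurface do the
pixel conversion. Roughly (a sketch: pow2() is a stand-in for whatever
round-up helper you use, and the masks shown are the little-endian RGBA
ones):

SDL_Surface *dst = SDL_CreateRGBSurface(SDL_SWSURFACE,
                                        pow2(text->w), pow2(text->h), 32,
                                        0x000000FF, 0x0000FF00,
                                        0x00FF0000, 0xFF000000);
SDL_SetAlpha(text, 0, 0);               /* copy the alpha channel, don't blend it */
SDL_BlitSurface(text, NULL, dst, NULL); /* the format conversion happens here */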

To the maintainer of SDL_ttf, would you consider adding a function to
render fonts directly to an SDL_Surface passed as an argument? This would
simplify using SDL_ttf with OGL, and even if it only performed the above
steps internally for now, it could eventually be optimized to remove all
the redundant software blitting.

Jp Calderone

On Sun, 28 Oct 2001, Joseph Carter wrote:

On Sun, Oct 28, 2001 at 08:10:08PM +0100, Damien Mascre wrote:

The problem is that the pixel format of the surface created by SDL is not
necessarily RGB in that order…

You must check and, if necessary, convert the SDL surface to the correct
pixel format before passing it to the GL texturing engine.

There are almost no image formats that OpenGL 1.2 doesn't support, though,
so it doesn't make much sense to convert everything to one particular
format. You can get OpenGL 1.2 headers for win32 easily enough, and if you
have Q3A you already have OpenGL 1.2 drivers.

My suggestion is to handle every format you expect to see yourself, using
the correct OpenGL texture formats. Convert anything else, for no other
reason than that some drivers are broken when fed uncommon formats.

However, SDL_ttf returns ARGB pixels (WHY???). OGL 1.2 does not

The alpha channel is there for transparency/anti-aliasing concerns…
and it is very useful…

support this format. So the SDL_Surface returned by TTF_RenderText*
must be converted, as well as SDL_BlitSurface()'d onto another surface
(granted you can combine these two steps) that has dimensions which are
powers of two. Then a GL texture can be created and used.

I am not an OpenGL guru… but I am not sure about OpenGL accepting
every pixel format… except raw RGB data, in native form, AFAIK.
The step of reading image file formats is delegated to a higher-level
programming interface, such as sdl_image, not handled at the OpenGL level…
correct me if I am wrong.

To the maintainer of SDL_ttf, would you consider adding a function to
render fonts directly to an SDL_Surface passed as an argument? This would
simplify using SDL_ttf with OGL, and even if it only performed the above
steps internally for now, it could eventually be optimized to remove all
the redundant software blitting.

I agree; even for SDL-only software, SDL_ttf fonts must often be cached in
SDL_Surfaces, using a lot of memory, because you need speed when rendering
your screens, and it is even more critical when you cannot pre-render the
fonts before redrawing your frame.

> On Sun, 28 Oct 2001, Joseph Carter wrote:

On Sun, Oct 28, 2001 at 08:10:08PM +0100, Damien Mascre wrote:

However, SDL_ttf returns ARGB pixels (WHY???). OGL 1.2 does not
support this format. So the SDL_Surface returned by TTF_RenderText*
must be converted, as well as SDL_BlitSurface()'d onto another surface
(granted you can combine these two steps) that has dimensions which are
powers of two. Then a GL texture can be created and used.

Ugh. You're right, it supports everything else, but expects the channels
to be in traditional order - RGBA or ABGR. ARGB is not supported.

As for the texture formats OpenGL does support as of 1.2, here’s the long
list:

GL_ALPHA4 GL_ALPHA8 GL_ALPHA12 GL_ALPHA16
GL_LUMINANCE4 GL_LUMINANCE8 GL_LUMINANCE12 GL_LUMINANCE16
GL_LUMINANCE4_ALPHA4 GL_LUMINANCE6_ALPHA2 GL_LUMINANCE8_ALPHA8
GL_LUMINANCE12_ALPHA4 GL_LUMINANCE12_ALPHA12 GL_LUMINANCE16_ALPHA16
GL_INTENSITY GL_INTENSITY4 GL_INTENSITY8 GL_INTENSITY12 GL_INTENSITY16
GL_R3_G3_B2 GL_RGB4 GL_RGB5 GL_RGB8 GL_RGB10 GL_RGB12 GL_RGB16
GL_RGBA2 GL_RGBA4 GL_RGB5_A1 GL_RGBA8 GL_RGB10_A2 GL_RGBA12 GL_RGBA16
GL_BGR GL_BGRA GL_UNSIGNED_BYTE_3_3_2 GL_UNSIGNED_BYTE_2_3_3_REV
GL_UNSIGNED_SHORT_5_6_5 GL_UNSIGNED_SHORT_5_6_5_REV
GL_UNSIGNED_SHORT_4_4_4_4 GL_UNSIGNED_SHORT_4_4_4_4_REV
GL_UNSIGNED_SHORT_5_5_5_1 GL_UNSIGNED_SHORT_1_5_5_5_REV
GL_UNSIGNED_INT_8_8_8_8 GL_UNSIGNED_INT_8_8_8_8_REV
GL_UNSIGNED_INT_10_10_10_2 GL_UNSIGNED_INT_2_10_10_10_REV

This is just about every single useful format I can imagine, except for
the indexed formats, which are technically extensions even if everyone
supports them in some form or another. I think it's probably not a very
good idea for SDL_ttf to send back ARGB, since there is no real reason to
return that over ABGR or RGBA; both of the latter are supported, while the
former is an odd format. The ones added by OpenGL 1.2 are the verbose
GL_UNSIGNED_* types at the end; the rest all work fine in 1.1.

I think this also proves my point that you can’t reasonably handle EVERY
single format SDL may throw at you with one of the above. Hence my
conclusion: When uploading SDL surfaces to OpenGL, check the pixel
formats. If you recognize them, upload it. If you don’t, create a
surface you will recognize, blit to it, and upload that.
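
A rough sketch of that check-then-upload idea (untested; "surf" stands for
whatever 32-bit surface you are uploading, convert_and_upload() is a made-up
placeholder, and GL_BGRA / GL_UNSIGNED_INT_8_8_8_8_REV need the 1.2 headers
mentioned above):

SDL_PixelFormat *fmt = surf->format;

if (fmt->BytesPerPixel == 4 &&
    fmt->Amask == 0xFF000000 && fmt->Rmask == 0x00FF0000 &&
    fmt->Gmask == 0x0000FF00 && fmt->Bmask == 0x000000FF) {
    /* SDL-style ARGB packed in a 32-bit word: GL_BGRA with the _REV packed
       type reads that layout on either endianness, so it can go up without
       conversion (power-of-two sizing still applies, of course). */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, surf->w, surf->h, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, surf->pixels);
} else {
    /* Anything unrecognized: blit it to a surface you do recognize and
       upload that instead. */
    convert_and_upload(surf);
}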

To the maintainer of SDL_ttf, would you consider adding a function to
render fonts directly to an SDL_Surface passed as an argument? This would
simplify using SDL_ttf with OGL, and even if it only performed the above
steps internally for now, it could eventually be optimized to remove all
the redundant software blitting.

That'd complicate the code a bit, I think. But if it's giving you ARGB
back, the current format is not very OpenGL-friendly at all.

On Sun, Oct 28, 2001 at 04:40:35PM -0500, Jp Calderone wrote:


Joseph Carter Free software developer

  • knghtbrd is gone - zzz - messages will be snapped like wet towels at all
    of the people who have stolen the trademark knghtbrd away message
    ack
  • Coderjoe prepares to defend himself from wet messages


Jp Calderone wrote:

However, SDL_ttf returns ARGB pixels (WHY???).

It has to pick one format, and ARGB is as good as any (actually, it’s better
than most).

Now to clear up some confusion about what the terms “ARGB”, “BGRA” etc mean.
Two interpretations are possible:

  1. ARGB means that a pixel is stored as an integer with alpha in the top bits,
    red in the next lower bits, then green, and blue in the lowest bits.
  2. ARGB means that a pixel is stored as four bytes: first alpha, then red,
    then green, and last blue.

Of course, these mean the same thing only on big-endian machines.
I usually favour #1, since a) the terminology works for smaller pixels
like 16bpp, b) it’s how SDL works (such as the masks passed to
CreateRGBSurface)

Confusingly enough, OpenGL uses interpretation #2 in its
documentation. And in the SDL docs, we sometimes talk about “RGBA” or
"RGB" to mean surfaces with/without alpha channel, not caring in which
order the channels come.
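
In SDL terms, interpretation #1 for 32-bit ARGB is just a set of masks:

/* Interpretation #1: ARGB as bit fields within a 32-bit pixel value. */
Uint32 Amask = 0xFF000000;
Uint32 Rmask = 0x00FF0000;
Uint32 Gmask = 0x0000FF00;
Uint32 Bmask = 0x000000FF;
/* In memory that pixel is the byte sequence B,G,R,A on a little-endian
   machine and A,R,G,B on a big-endian one, which is why the two
   interpretations only agree on big-endian hardware. */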

To the maintainer of SDL_ttf, would you consider adding a function to
render fonts directly to an SDL_Surface passed as an argument? This would
simplify using SDL_ttf with OGL, and even if it only performed the above
steps internally for now, it could eventually be optimized to remove all
the redundant software blitting.

Why, have you identified it as a bottleneck in your code? I would be very
surprised if caching your rendered strings or glyphs wouldn't help.
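
Even something as crude as this would cover most HUD/menu text (a sketch;
the names are made up, there is no eviction, and the colour is not part of
the key):

#define TEXT_CACHE_SIZE 64

struct cached_text {
    char         str[128];
    SDL_Surface *surf;
};

static struct cached_text text_cache[TEXT_CACHE_SIZE];
static int text_cache_used = 0;

/* Return a cached rendering of str, rendering it once on first use.
   Cached surfaces are owned by the cache; the full-cache fallback is
   owned by the caller. */
SDL_Surface *get_text(TTF_Font *font, const char *str, SDL_Color color)
{
    int i;

    for (i = 0; i < text_cache_used; i++)
        if (strcmp(text_cache[i].str, str) == 0)
            return text_cache[i].surf;

    if (text_cache_used == TEXT_CACHE_SIZE)
        return TTF_RenderText_Blended(font, str, color);

    strncpy(text_cache[text_cache_used].str, str,
            sizeof(text_cache[text_cache_used].str) - 1);
    text_cache[text_cache_used].surf = TTF_RenderText_Blended(font, str, color);
    return text_cache[text_cache_used++].surf;
}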

Jp Calderone wrote:

However, SDL_ttf returns ARGB pixels (WHY???).

It has to pick one format, and ARGB is as good as any (actually, it’s better
than most).

Quite true that it has to pick one format - unless it renders to a given
SDL_Surface, in which case things will end up in whatever format the user
likes, probably at some performance penalty though - but I don’t follow
how you’ve decided that ARGB is as good or better than most. I have
always taken pixel format strings such as these to give the contents and
the order of the pixels, and this is the order I have always seen them
presented in. My experience in this area is probably more limited than
that of many on this list though, so if you could elaborate as to why
ARGB is superior to other formats, perhaps that would help my
understanding of things. Presently, I can only tell that this format
makes it difficult to use SDL_ttf output with OGL.

Now to clear up some confusion about what the terms “ARGB”, “BGRA” etc mean.
Two interpretations are possible:

  1. ARGB means that a pixel is stored as an integer with alpha in the top bits,
    red in the next lower bits, then green, and blue in the lowest bits.
  2. ARGB means that a pixel is stored as four bytes: first alpha, then red,
    then green, and last blue.

Of course, these mean the same thing only on big-endian machines.
I usually favour #1, since a) the terminology works for smaller pixels
like 16bpp, b) it’s how SDL works (such as the masks passed to
CreateRGBSurface)

Confusingly enough, OpenGL uses interpretation #2 in its
documentation. And in the SDL docs, we sometimes talk about “RGBA” or
"RGB" to mean surfaces with/without alpha channel, not caring in which
order the channels come.

OpenGL doesn't use exactly the second interpretation. If you use BYTE
or UNSIGNED_BYTE as the type, it does. But OpenGL provides other options
(SHORT, INT, FLOAT, and DOUBLE, as well as the unsigned versions). I've
never had call for a 16-byte pixel, but it is supported. So really the
second interpretation is the same as the first, with an assumed 8 bits per
component, which OpenGL doesn't assume, because you must pass one of the
type options in. OpenGL's support for smaller pixels is less flexible than
SDL's, if a little more convenient to use (the GL_R3_G3_B2 etc. constants),
but it clearly does exist.

To the maintainer of SDL_ttf, would you consider adding a function to
render fonts directly to an SDL_Surface passed as an argument? This would
simplify using SDL_ttf with OGL, and even if it only performed the above
steps internally for now, it could eventually be optimized to remove all
the redundant software blitting.

why, have you identified it as a bottleneck in your code? I would be very
surprised if caching your rendered strings or glyphs wouldn’t help

I guess what I’d really like is an interface to ttf that allows the
size and pixel format of the resulting surface to be specified. I know
nothing of the internals, so maybe this isn’t feasible, I’m hoping it is.
As for it being a bottleneck in my code: it is slightly, about 6% of total
execution time - and I’m not doing very much of it at all. I bet I could
speed it up a lot with pre-rendering, but that’s less fun than writing new
parts of the game ;-)

Jp Calderone

On Mon, 29 Oct 2001, Mattias Engdegard wrote:

Jp Calderone wrote:

[…]- but I don’t follow
how you’ve decided that ARGB is as good or better than most.

Actually it was Sam's decision, but having RGB in the low 24 bits is not
uncommon in hardware, which makes ARGB quick to alpha-blend onto a
screen surface.

OpenGL doesn’t use exactly the second interpretation. If you use BYTE
or UNSIGNED_BYTE as the type it does.

same thing — the second interpretation uses the order of components
in memory, not the bit fields in a pixel value

As for it being a bottleneck in my code: it is slightly, about 6% of total
execution time - and I’m not doing very much of it at all. I bet I could
speed it up a lot with pre-rendering, but that’s less fun than writing new
parts of the game ;-)

And how fun do you think it is for us to compensate for your code?
I don't use SDL_ttf at all, so if you want to improve it, send a patch.

LordHavoc has reminded me that Win32 (particularly DX) uses ARGB for
everything as a Uint. Of course, once you accommodate the little-endian
architecture, it becomes BGRA, which all cards do support since windoze
requires it in DX. It's about as common as multitexture on cards.

On Mon, Oct 29, 2001 at 10:45:04PM -0500, Jp Calderone wrote:

Quite true that it has to pick one format - unless it renders to a given
SDL_Surface, in which case things will end up in whatever format the user
likes, probably at some performance penalty though - but I don’t follow
how you’ve decided that ARGB is as good or better than most. I have
always taken pixel format strings such as these to give the contents and
the order of the pixels, and this is the order I have always seen them
presented in. My experience in this area is probably more limited than
that of many on this list though, so if you could elaborate as to why
ARGB is superior to other formats, perhaps that would help my
understanding of things. Presently, I can only tell that this format
makes it difficult to use SDL_ttf output with OGL.


Joseph Carter Free software developer

  • SynrG notes that the number of configuration questions to answer in
    sendmail is NON-TRIVIAL
