TrueType vs. bitmapped fonts? Opinions?

What about performance? As far as I looked at the current sdlttf lib
implementation, there is this surface conversion (for me from
8bit->16bit). Doesn’t it take much time, every time, to create a surface,
draw on it, convert it, blit it and finally destroy it, just to display one
string of text? Maybe somebody has rewritten it to work on an existing
16 bit surface, drawing directly onto it without creating a new one? Of
course with bitmapped fonts there is no antialiasing, and the fonts must
be drawn by hand.

Any opinions or advice?

Kovacs

Why not just convert all the surfaces with SDL_DisplayFormat()? This should
make everything as fast as possible.

On Mon, 10 Apr 2000, you wrote:

[…]

-Garrett, WPI student majoring in Computer Science
"The fastest way to succeed is to look as if you’re playing by somebody
else’s rules, while quietly playing by your own." -Michael Konda

Yes, with bitmapped fonts that’s just normal, but I’m talking about TrueType
fonts. Take a look at the sdlttf lib source & demo and you will see what
I’m talking about.

Kovacs

On Mon, 10 Apr 2000, Garrett wrote:

Why not just convert all the surfaces with SDL_DisplayFormat()? This should
make everything as fast as possible.

[…]

Of course with
bitmapped fonts there is no antialiasing and fonts must be drawn.

Why not? Just use alpha channels. :)

-bill!

What is the big need for TTF in SDL? I mean, if it’s a game or something
that needs performance, bitmap fonts should be used.

On Mon, 10 Apr 2000, William Kendrick wrote:

Of course with
bitmapped fonts there is no antialiasing and fonts must be drawn.

Why not? Just use alpha channels. :)

-bill!

What is the big need for TTF in SDL? I mean, if it’s a game or something
that needs performance, bitmap fonts should be used.

Bitmap fonts don’t scale worth beans. That said, rendering from the TTF each time isn’t the right way to go either.

Here’s the real best way to go, IMO, and an example of how I feel C++ and classes should be used. More on this sort of ranting
later, as if you haven’t heard enough!

Create a CFont object. When it initializes, have it choose a font and a point size. The font’s constructor then renders, using
SDL_ttf (or whatever), each letter of the font that you’ll actually end up using (for easy use, index your font
characters internally by something useful, like their ASCII codes). Store 'em with alpha values, like Bill suggests, the
smart fellow that he is. Then, your rendering code would look something like:

myFont->printf(0,0,"Mord wuz here\n");

which just goes through and blasts down the bitmaps. Then, if you need another font, allocate another CFont object. Best of both
worlds.

Nichol
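Nichol’s glyph-cache idea can be sketched roughly like so. This is a hedged, SDL-free sketch: `Glyph` and `renderGlyph()` below are stand-ins for an `SDL_Surface*` with per-pixel alpha and the SDL_ttf per-glyph rendering call the real constructor would make, and the blit itself is stubbed out:

```cpp
#include <cstdarg>
#include <cstdio>
#include <map>
#include <vector>

// Stand-in for a rendered glyph. In real code this would hold an
// SDL_Surface* with per-pixel alpha, rendered once by SDL_ttf.
struct Glyph {
    int width;                           // horizontal advance in pixels
    std::vector<unsigned char> pixels;   // fake bitmap payload
};

class CFont {
public:
    // The expensive step: render every printable ASCII glyph once,
    // up front, at the chosen point size.
    explicit CFont(int pointSize) : size(pointSize) {
        for (char c = ' '; c <= '~'; ++c)
            cache[c] = renderGlyph(c);
    }

    // The fast path: format the string, then just blit cached glyphs.
    // Returns how many glyphs were drawn (the blit is stubbed out here).
    int printf(int x, int y, const char *fmt, ...) {
        (void)y;                         // y is used by the real blit
        char buf[256];
        va_list ap;
        va_start(ap, fmt);
        vsnprintf(buf, sizeof buf, fmt, ap);
        va_end(ap);

        int drawn = 0;
        for (const char *p = buf; *p; ++p) {
            std::map<char, Glyph>::const_iterator it = cache.find(*p);
            if (it == cache.end())
                continue;                // not cached (e.g. '\n')
            // blitGlyph(screen, x, y, it->second);  // real blit here
            x += it->second.width;
            ++drawn;
        }
        return drawn;
    }

private:
    // Placeholder for the SDL_ttf glyph render: fakes a fixed 8x8 glyph.
    Glyph renderGlyph(char c) {
        Glyph g;
        g.width = 8;
        g.pixels.assign(64, static_cast<unsigned char>(c));
        return g;
    }

    int size;                            // kept for re-rendering later
    std::map<char, Glyph> cache;
};
```

With this, `myFont->printf(0, 0, "Mord wuz here\n")` touches the TrueType rasterizer zero times per call; only the constructor pays the rendering cost.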

On Mon, 10 Apr 2000, William Kendrick wrote:

Of course with
bitmapped fonts there is no antialiasing and fonts must be drawn.

Why not? Just use alpha channels. :)

-bill!

I use PNG files for that purpose. RGBA works great!!! The file has built-in
transparency, so there is no need for the extra line of code to set the
color key!!

:)

Dave

Yes, this sounds like the most reasonable solution. But how can I use
per-pixel alpha transparency (I’m using 16bpp)? There was one message about
it, but I could not clearly understand it.

Kovacs

On 10 Apr 2000 vining at pacificcoast.net wrote:

[…]

Remember, 16bpp color is the same (on the surface) as a Uint16, so you
can compare it to a Uint16 value. Also, in the book “Real Time
Strategy Game Programming using MS DX6” there is a great example of how
to compress a 16bpp sprite. This method has the great feature of
compressing out the transparent parts of the image.

			-fjr
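On Kovacs’ per-pixel alpha question: a 16bpp pixel carries no alpha channel of its own, so the blend has to happen in software, channel by channel. A minimal sketch, assuming an RGB565 pixel layout (a 555 surface would need different shifts and masks):

```cpp
typedef unsigned short Uint16;
typedef unsigned char  Uint8;

// Blend one RGB565 source pixel over an RGB565 destination pixel with
// an 8-bit alpha (0 = fully transparent, 255 = fully opaque).
// Each channel is unpacked, interpolated, and repacked.
Uint16 blend565(Uint16 src, Uint16 dst, Uint8 alpha)
{
    // Map 255 -> 256 so full alpha reproduces the source exactly.
    int a = alpha + (alpha >> 7);

    int sr = (src >> 11) & 0x1F, sg = (src >> 5) & 0x3F, sb = src & 0x1F;
    int dr = (dst >> 11) & 0x1F, dg = (dst >> 5) & 0x3F, db = dst & 0x1F;

    // dst + (src - dst) * alpha / 256, per channel.
    int r = dr + (((sr - dr) * a) >> 8);
    int g = dg + (((sg - dg) * a) >> 8);
    int b = db + (((sb - db) * a) >> 8);

    return (Uint16)((r << 11) | (g << 5) | b);
}
```

Looping this over a glyph’s pixels gives antialiased text on a 16bpp surface; it is per-pixel work, which is exactly why pre-rendering the glyphs once, as discussed above, pays off.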

Kovacs wrote:

[…]

You mean RLE compression? I’m talking not about one transparency value for
the whole surface, but about a transparency value for each non-color-keyed
pixel, and I’m curious whether this is supported by hardware or must be
written by hand.

Kovacs

On Tue, 11 Apr 2000, Frank J. Ramsay wrote:

[…]

Hi,
Why does the SDL audio stop playing in Win32 when the window
loses focus, and how can I get rid of this behavior? Thanks…
-The Mighty Mike Master
http://tmmm.simplenet.com

No, it’s not RLE compression… the author of the book refers to it
as LLE compression. I’ll try and explain it.

If a scanline of the bitmap is:

  • completely transparent, just store a single ‘0’ for it;
  • completely solid, store a ‘1’ followed by the line data;
  • a mix of solid and transparent information, store:
    • the number of ‘runs’ (a run being a contiguous series of
      solid pixels) + 1,
    • then the decompressed offset of each run,
    • then the length of each run,
    • then the data for the solid runs.
      (I don’t remember how the transparent parts are
      decoded, but it works.)

So when it is blitted, if the loop hits a ‘0’ for the first byte
of the line it just skips the line. If it hits a ‘1’ it just memcpys
the entire row. If it hits a number greater than 1, it blits each run
separately with a memcpy.
While this may sound slow, it’s a lot faster than checking each
pixel individually.

			-fjr
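That scanline scheme can be sketched as follows. This is a loose reconstruction, not the book’s code: the concrete layout is my own guess (offsets, lengths and pixels all stored as Uint16 words in one stream, with a caller-supplied colorkey), and the transparent-part decoding detail fjr doesn’t remember is simply not needed in this variant:

```cpp
#include <cstring>
#include <vector>

typedef unsigned short Uint16;

// Encode a w*h 16bpp sprite, one scanline at a time:
//   0                                            -> line fully transparent
//   1, <w pixels>                                -> line fully solid
//   runs+1, offsets..., lengths..., run pixels... -> mixed line
std::vector<Uint16> lleEncode(const Uint16 *src, int w, int h, Uint16 key)
{
    std::vector<Uint16> out;
    for (int y = 0; y < h; ++y) {
        const Uint16 *line = src + y * w;

        // Find the solid runs on this line.
        std::vector<Uint16> off, len;
        for (int x = 0; x < w; ) {
            while (x < w && line[x] == key) ++x;     // skip transparent
            int start = x;
            while (x < w && line[x] != key) ++x;     // measure solid run
            if (x > start) {
                off.push_back((Uint16)start);
                len.push_back((Uint16)(x - start));
            }
        }

        if (off.empty()) {                 // completely transparent
            out.push_back(0);
        } else if (len[0] == w) {          // completely solid
            out.push_back(1);
            out.insert(out.end(), line, line + w);
        } else {                           // mixed: runs+1, offs, lens, data
            out.push_back((Uint16)(off.size() + 1));
            out.insert(out.end(), off.begin(), off.end());
            out.insert(out.end(), len.begin(), len.end());
            for (size_t i = 0; i < off.size(); ++i)
                out.insert(out.end(), line + off[i], line + off[i] + len[i]);
        }
    }
    return out;
}

// Blit: skip transparent lines, memcpy solid lines, memcpy each run of a
// mixed line -- no per-pixel transparency test anywhere.
void lleBlit(const std::vector<Uint16> &enc, Uint16 *dst, int dstPitch,
             int w, int h)
{
    size_t p = 0;
    for (int y = 0; y < h; ++y) {
        Uint16 *line = dst + y * dstPitch;
        Uint16 op = enc[p++];
        if (op == 0)
            continue;                                 // skip the whole line
        if (op == 1) {                                // copy the whole row
            std::memcpy(line, &enc[p], w * sizeof(Uint16));
            p += (size_t)w;
            continue;
        }
        int runs = op - 1;                            // blit each run
        const Uint16 *off  = &enc[p];
        const Uint16 *len  = off + runs;
        const Uint16 *data = len + runs;
        for (int i = 0; i < runs; ++i) {
            std::memcpy(line + off[i], data, len[i] * sizeof(Uint16));
            data += len[i];
        }
        p = (size_t)(data - &enc[0]);
    }
}
```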

Kovacs wrote:

[…]

No, it’s not RLE compression… the author of the book refers to it
as LLE compression. I’ll try and explain it.

If a scanline of the bitmap is:

  • completely transparent just store a single ‘0’ for it
  • completely solid, store a 1 followed by the line data
  • a mix of solid and transparent information store:
    […]

This is essentially exactly what SDL does with (misnamed) RLE colorkey
blit acceleration. The trickiest part is dynamically recompiling the
sprite to accommodate clipping (and it’s still faster than testing each
individual pixel.)

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

This is essentially exactly what SDL does with (misnamed) RLE colorkey
blit acceleration. The trickiest part is dynamically recompiling the
sprite to accommodate clipping (and it’s still faster than testing each
individual pixel.)

Have you benchmarked this, so you know it’s faster to recompile and then
blit than to clip it while drawing? That is what I do, and I’m a little
ashamed of not having timed it. If you know you’re going to blit the same
sprite at the same position more than once it probably pays off, but
then you’ll have to save the clipped sprite somewhere.

I suppose it’s not a big deal since most sprites are not clipped,
unless you have huge sprites (boss monsters :)

Have you benchmarked this, so you know it’s faster to recompile and then
blit than to clip it while drawing?

Well, the clipped case is faster if you clip during drawing, but the
normal case is slower. I figured the hit in recompiling was more than
made up for by the speed increase for the unclipped case.

-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Well, the clipped case is faster if you clip during drawing, but the
normal case is slower. I figured the hit in recompiling was more than
made up for by the speed increase for the unclipped case.

Even with a separate paint_clipped_sprite()?

Well, the clipped case is faster if you clip during drawing, but the
normal case is slower. I figured the hit in recompiling was more than
made up for by the speed increase for the unclipped case.

Even with a separate paint_clipped_sprite()?

Mmmm, no. :)

Hmm, want to improve SDL_RLEaccel.c? :)

See ya!
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Even with a separate paint_clipped_sprite()?

Mmmm, no. :)

Hmm, want to improve SDL_RLEaccel.c? :)

If I get some time, I might. :) But I’d have to do some extensive timing
to make sure it’s worth it.

Also I observed that SDL interleaves pixel data and opcodes in the same byte
stream. In my code I put the opcodes in a separate sequence, so that the
pixel data will always be aligned. It probably eats another register when
decoding, but I figured the raw data copy would be faster. I need to do more
timing, I guess. :)

– Mattias

Hi,
Why does the SDL audio stop playing in Win32 when the window
loses focus, and how can I get rid of this behavior? Thanks…

This is because SDL is writing directly to the audio hardware, and when
the window loses focus, DirectX removes exclusive access to the audio.
If you want to see how DirectSound performs without primary write access,
just comment out the line “CreatePrimaryBuffer” (or something like that)
in SDL_dx5audio.c

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software

Mattias Engdegård wrote:

[…]

Also I observed that SDL interleaves pixel data and opcodes in the same byte
stream. In my code I put the opcodes in a separate sequence, so that the
pixel data will always be aligned. […]

I think there are many ways to do RL-encoded sprites.

I use no opcodes. I store only lengths:
[SKIP-LENGTH] [COPY-LENGTH] [DATA…] [SKIP-LENGTH] [COPY-LENGTH] [DATA…]

It saves the space and the opcode test.
If you have more than 255 bytes of transparency, just emit an empty copy area.
The pixel data is always aligned. But an extra test is needed to detect the
end of a line.

It would be nice to see more ideas or implementations.
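In that spirit, here is a rough sketch of the skip/copy layout. One liberty taken: lengths here are Uint16 words in the same stream as the pixels (Johns uses bytes, hence his 255 limit and the empty-copy trick, neither of which this variant needs). The end-of-line check is the extra test he mentions:

```cpp
#include <cstring>
#include <vector>

typedef unsigned short Uint16;

// Each scanline becomes alternating [skip][copy] length pairs, with the
// copied pixels inline right after each copy length. There are no
// opcodes to test: the stream alternates by construction, and a line
// ends when the accumulated skip+copy lengths reach the sprite width.
std::vector<Uint16> scEncode(const Uint16 *src, int w, int h, Uint16 key)
{
    std::vector<Uint16> out;
    for (int y = 0; y < h; ++y) {
        const Uint16 *line = src + y * w;
        int x = 0;
        while (x < w) {
            int skip = 0, copy = 0;
            while (x + skip < w && line[x + skip] == key)
                ++skip;                               // transparent run
            while (x + skip + copy < w && line[x + skip + copy] != key)
                ++copy;                               // solid run
            out.push_back((Uint16)skip);
            out.push_back((Uint16)copy);
            out.insert(out.end(), line + x + skip, line + x + skip + copy);
            x += skip + copy;
        }
    }
    return out;
}

void scBlit(const std::vector<Uint16> &enc, Uint16 *dst, int dstPitch,
            int w, int h)
{
    size_t p = 0;
    for (int y = 0; y < h; ++y) {
        Uint16 *line = dst + y * dstPitch;
        int x = 0;
        while (x < w) {                  // the end-of-line test
            x += enc[p++];               // skip transparent pixels
            int copy = enc[p++];
            if (copy) {                  // then copy solid ones
                std::memcpy(line + x, &enc[p], copy * sizeof(Uint16));
                p += (size_t)copy;
            }
            x += copy;
        }
    }
}
```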

Bye.
Johns
--
Become famous, earn no money, create graphics for FreeCraft.

http://FreeCraft.Org - A free fantasy real-time strategy game engine
http://fgp.cjb.net - The FreeCraft Graphics Project