Question about texture size

Hi people,

I am writing this 2D-platform game using OpenGL and stuff, and I want to
store all ‘tiles’ to be ‘blitted’ onto the screen into one big texture.
This way I can use only one texture while drawing the game’s background
which should be really fast. So far, the theory. Now, the question:

I am thinking about using a 2048x64 or something as this texture, and I
know that the later graphics cards (like TNT 2, GeForce etc.) will
support this texture size, but Voodoo 1, 2 and 3 will not. So I would
render these cards useless. Does someone have a thought regarding a good
average texture size to use to keep maximum compatibility? (256x256,
which is the max of Voodoo as far as I know, will not work for me; I
do need a bigger texture for this.)

Thanks for any help!
With kind regards,
Martijn Melenhorst

Why not just have several 256x256 textures and keep track of which has what?
Unless there’s some reason why you can only use a single texture, it seems
like that would be a pretty OK way to go; the code would just be a little
bit more complicated (a very wee bit).

Hi people,

I am writing this 2D-platform game using OpenGL and stuff, and I want to
store all ‘tiles’ to be ‘blitted’ onto the screen into one big texture.
This way I can use only one texture while drawing the game’s background
which should be really fast. So far, the theory. Now, the question:

I’m not sure if it actually matters that much.

I did some tests with glSDL (which does no texture binding
optimizations at all), and only ended up with slower rendering
on the G400. Texture switches seem to cost virtually nothing on
that card.

That said, it might be that texture binding actually has a
significant cost on older cards - on which you can’t avoid it,
of course. heh

I am thinking about using a 2048x64 or something as this texture, and I

It might be a better idea to stick with square textures, especially
since you seem to worry about older and/or lower end cards…

know that the later graphics cards (like TNT 2, GeForce etc.) will
support this texture size, but Voodoo 1, 2 and 3 will not. So I would
render these cards useless. Does someone have a thought regarding a good
average texture size to use to keep maximum compatibility? (256x256,
which is the max of Voodoo as far as I know, will not work for me; I
do need a bigger texture for this.)

2048x2048 is the new “standard”, but I’ve seen a few drivers that
are restricted to 1024x1024 for some reason, despite the hardware
supporting 2048x2048.

Anyway, I think relying on any specific texture size being supported
is a bad idea, and I can’t see a real motivation to do it -
especially not when rendering tiled backgrounds. It’s quite trivial
to reduce texture switches to MIN(tiles_on_screen, tile_textures).

The easiest way I can think of would be something like:

for(texture in tile_textures)
{
    bound = 0;
    for(y in screen_rows)
        for(x in screen_columns)
        {
            if(tiles[map(x, y)].texture == texture)
            {
                if(!bound)
                {
                    glBindTexture(GL_TEXTURE_2D, texture);
                    bound = 1;
                }
                render_tile(tiles[map(x, y)], x, y);
            }
        }
}

Obviously, there are more efficient (and more complicated…)
ways of doing it, but I doubt the overhead is significant unless
you have extremely small tiles.
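
To make "more efficient" a bit more concrete: here is a rough sketch (not
from the original post) that gathers the visible tiles into per-texture
buckets in one pass, so each texture is bound at most once per frame. The
tiles[], map() and render_tile() helpers are the same hypothetical ones as
in the pseudocode above; here tiles[].texture is treated as an index into a
texture_name[] table.

#include <string.h>
#include <GL/gl.h>

/* Hypothetical helpers, assumed to match the pseudocode above: */
struct Tile { int texture; /* index into texture_name[] */ };
extern struct Tile tiles[];
extern int map(int x, int y);
extern void render_tile(struct Tile t, int x, int y);
extern GLuint texture_name[];    /* GL texture names, filled by the loader */

#define MAX_TEXTURES 64          /* upper bound on distinct tile textures */
#define MAX_VISIBLE  (64 * 48)   /* upper bound on tiles on screen        */

static struct { int x, y; } bucket[MAX_TEXTURES][MAX_VISIBLE];
static int bucket_len[MAX_TEXTURES];

void render_background(int screen_rows, int screen_columns, int num_textures)
{
    int x, y, t, i;

    memset(bucket_len, 0, sizeof bucket_len);

    /* One pass over the visible map: group tile positions by texture. */
    for(y = 0; y < screen_rows; ++y)
        for(x = 0; x < screen_columns; ++x)
        {
            int tex = tiles[map(x, y)].texture;
            bucket[tex][bucket_len[tex]].x = x;
            bucket[tex][bucket_len[tex]].y = y;
            ++bucket_len[tex];
        }

    /* One glBindTexture() per texture that actually appears on screen. */
    for(t = 0; t < num_textures; ++t)
    {
        if(!bucket_len[t])
            continue;
        glBindTexture(GL_TEXTURE_2D, texture_name[t]);
        for(i = 0; i < bucket_len[t]; ++i)
            render_tile(tiles[map(bucket[t][i].x, bucket[t][i].y)],
                        bucket[t][i].x, bucket[t][i].y);
    }
}

The scan is O(tiles on screen) rather than O(textures x tiles), which only
starts to matter with very small tiles or many tile textures.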

//David

.---------------------------------------
| David Olofson
| Programmer

david.olofson at reologica.se
Address:
REOLOGICA Instruments AB
Scheelevägen 30
223 63 LUND
Sweden
---------------------------------------
Phone: 046-12 77 60
Fax: 046-12 50 57
Mobil:
E-mail: david.olofson at reologica.se
WWW: http://www.reologica.se

`-----> We Make Rheology Real

Why not just have several 256x256 textures and keep track of which has what?

Now that you mention it…

Unless there’s some reason why you can only use a single texture, it seems
like that would be a pretty OK way to go; the code would just be a little
bit more complicated (a very wee bit).

…there’s an even simpler way than what I proposed, which probably
works just as well, at least if you use some serious graphics. (By
"serious graphics", I mean artwork that makes the levels look more
like large pictures than tiled maps. Lots of tiles that are crafted
to fit perfectly together to form larger objects, that is.)

Just put tiles that are normally used together on the same texture.
Then do the obvious

if(texture != current_texture)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    current_texture = texture;
}

inside the plain “render all tiles on screen” loop.

You may improve things further by not rendering row by row, but
rather in blocks of 4x4 tiles or something. (In the average case,
I’d think a 4x4 area is more likely to contain mostly "related"
tiles than a 16x1 area.)
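
For what it’s worth, a small sketch of that block traversal, reusing the
same hypothetical tiles[]/map()/render_tile() helpers and the bind-on-change
test from the snippet above:

#include <GL/gl.h>

struct Tile { GLuint texture; /* GL texture name */ };
extern struct Tile tiles[];
extern int map(int x, int y);
extern void render_tile(struct Tile t, int x, int y);

void render_in_blocks(int screen_rows, int screen_columns)
{
    GLuint current_texture = 0;   /* glGenTextures() never returns 0 */
    int bx, by, x, y;

    /* Walk the visible map in 4x4 blocks instead of full rows, so that
       "related" tiles (which tend to share a texture) are drawn together. */
    for(by = 0; by < screen_rows; by += 4)
        for(bx = 0; bx < screen_columns; bx += 4)
            for(y = by; y < by + 4 && y < screen_rows; ++y)
                for(x = bx; x < bx + 4 && x < screen_columns; ++x)
                {
                    GLuint texture = tiles[map(x, y)].texture;
                    if(texture != current_texture)
                    {
                        glBindTexture(GL_TEXTURE_2D, texture);
                        current_texture = texture;
                    }
                    render_tile(tiles[map(x, y)], x, y);
                }
}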

//David


Everything but the early 3dfx cards supports 2048x2048. If you don’t care
to support these cards (specifically 3dfx < Voodoo4 in Linux and < Voodoo3
in win32) then you’re all set.

Otherwise you’ll need more than one texture. Note: texture changes are
pretty fast once the textures themselves are resident. If you have more
textures than will fit resident, things get interesting.

On Thu, Jul 04, 2002 at 07:45:42PM +0200, Martijn Melenhorst (Prive) wrote:

I am writing this 2D-platform game using OpenGL and stuff, and I want to
store all ‘tiles’ to be ‘blitted’ onto the screen into one big texture.
This way I can use only one texture while drawing the game’s background
which should be really fast. So far, the theory. Now, the question:

I am thinking about using a 2048x64 or something as this texture, and I
know that the later graphics cards (like TNT 2, GeForce etc.) will
support this texture size, but Voodoo 1, 2 and 3 will not. So I would
render these cards useless. Does someone have a thought regarding a good
average texture size to use to keep maximum compatibility? (256x256,
which is the max of Voodoo as far as I know, will not work for me; I
do need a bigger texture for this.)


Joseph Carter Not many fishes

LordHavoc: The reason why GL has overdraw is because it is only
using HALF of the system they designed for vis.
LordHavoc: Shooting itself in the foot.

  • Dabb looks at all those bullet holes in his shoes - damn, lots :)


[…]

Everything but the early 3dfx cards

  • and Matrox G400 on Linux, even with reasonably new drivers -

supports 2048x2048.

Actually, it won’t refuse to “deal with” 2048x2048, but if you ask,
the driver says 1024x1024, and 2048x2048 can do anything from
producing incorrect results to crashing X.

One would assume that this is fixed in later drivers, as 2048x2048
works on Win32, and for all other matters, the Win32 and Linux
drivers seem to have exactly the same flaws. (Like not being able
to handle multiple contexts… grrrr :( )

//David


The whole point of having one texture to deal with while ‘blitting’ the
entire screen full is to not have to switch between two or more textures
while doing so. This makes the difference between 120 and 260 fps for
me(!). So I am looking for a way to store all my ‘blocks’ into one
texture, and then letting the tile-routine have its (corrected by DJ JRN)
go with it.

Hope that helps.
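
As an aside, the bookkeeping for such a single tile texture is tiny. A
sketch (the 32x32 tile size and 256x256 texture size are just examples, not
taken from the mail) of computing the texture coordinates of tile number i
inside one big texture:

#define TILE_SIZE      32                       /* example tile size     */
#define ATLAS_SIZE     256                      /* example texture size  */
#define TILES_PER_ROW  (ATLAS_SIZE / TILE_SIZE) /* 8 tiles per row       */

/* Fill in the texture coordinate rectangle (u0,v0)-(u1,v1) of tile i. */
void tile_texcoords(int i, float *u0, float *v0, float *u1, float *v1)
{
    int tx = i % TILES_PER_ROW;
    int ty = i / TILES_PER_ROW;

    *u0 = (float)(tx * TILE_SIZE) / ATLAS_SIZE;
    *v0 = (float)(ty * TILE_SIZE) / ATLAS_SIZE;
    *u1 = (float)((tx + 1) * TILE_SIZE) / ATLAS_SIZE;
    *v1 = (float)((ty + 1) * TILE_SIZE) / ATLAS_SIZE;
}

In practice the coordinates are often inset by half a texel to avoid
bleeding from neighbouring tiles when filtering is enabled.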

Atrix Wolfe wrote:

[…]

Of course relying on any texture size is something you need to do, even
if it’s 32x32 pixels. My question concerned what texture size to depend
on, then, since the reliance cannot be avoided. Strange, by the way,
that the G400 would not have a hard time with this, because my GeForce 2
card sure does slow down if I keep switching textures for every blit. Not
that this is something that I’d normally do, but it proves that it does
cause a slowdown…

David Olofson wrote:

[…]

Thanks for the answer. I guess I’ll go for the 2048x2048 option then, sod
those Voodoo users :)

I guess NVidia TNT 1 will do 2048x2048 too, so then I’ve got most users
covered. Otherwise, they can always revert to software emulation and
buy a really faaaassstttt processor :)))))))

Okay, anyways, thanks.

Joseph Carter wrote:

[…]

No, it was only fixed in Win32 for Voodoo3 (Voodoo4/5 do not have the
limitation…). It’s still borked for Voodoo2 and under, and AFAIK, nobody
ever ported the workaround to Linux.

On Thu, Jul 04, 2002 at 10:57:38PM +0200, David Olofson wrote:

[…]

Everything but the early 3dfx cards

  • and Matrox G400 on Linux, even with reasonably new drivers -

supports 2048x2048.

Actually, it won’t refuse to “deal with” 2048x2048, but if you ask,
the driver says 1024x1024, and 2048x2048 can do anything from
producing incorrect results to crashing X.

One would assume that this is fixed in later drivers, as 2048x2048
works on Win32, and for all other matters, the Win32 and Linux
drivers seem to have exactly the same flaws. (Like not being able
to handle multiple contexts… grrrr :( )


Joseph Carter This thing is an AI

Feanor - license issues are important. If we don’t watch our
arses now, someone’s gonna come up and bite us later…


Of course relying on any texture size is something you need to do,
even if it’s 32x32 pixels.

Of course - that’s not really what I meant. (Although it’s kind of
cool to fake a max texture size of 32x32 and still have the code
work properly. :) )

My question concerned what texture
size to depend on, then, since the reliance cannot be avoided.

256x256. IIRC, it’s a requirement for OpenGL compliance.

Strange, by the way, that the G400 would not have a hard time with
this, because my GeForce 2 card sure does slow down if I keep
switching textures for every blit. Not that this is something
that I’d normally do, but it proves that it does cause a
slowdown…

Yeah, one would have thought that nVidia should have faster
drivers and/or hardware… Either nVidia somehow optimizes the
general case at the expense of texture switches, or Matrox is
doing something stupid that makes the G400 driver hit the
texture binding overhead for every polygon, whether you change
the texture or not. Or it’s just some very interesting bug in
my code! :)

My benchmark results were weird enough that I should really try to
find out what actually happened.

//David


[…G400 and Voodoo issues…]

No, it was only fixed in Win32 for Voodoo3 (Voodoo4/5 do not have the
limitation…). It’s still borked for Voodoo2 and under, and AFAIK, nobody
ever ported the workaround to Linux.

Oh, I was talking about the G400 :) - but it might apply to the
G400 as well. (At least, I’ve never seen 2048x2048 work on a G400
on Linux…)

//David


Just for the people who did not know, as it was not mentioned in this
thread before, it seems that, regardless of which size you are going to
use, you should stick to a ‘power of 2’ square size. So: 256x256, 512x512
or 1024x1024 will do, but, for example, 768x768 will not (although it’s a
multiple of 32, which always makes sense computerwise ;P).

This is too bad for me, since I am forced to use a 1024x1024 texture for a
texture that’s actually 960x960 pixels. A waste of graphics mem, I would
say, but I’ll survive: since the game is running in 16-bit mode, the
texture can be stored as A1R5G5B5, which runs really well over here.

Just thought y’all would like to know.

Thanks for all your help also, so far.
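
For illustration, a sketch of the padding described above: round each
dimension up to the next power of two, upload the real image into one
corner, and only use the part of the texture coordinate range that is
actually filled (960/1024 = 0.9375 here). The GL_RGB5_A1 internal format
request is an example; exactly how A1R5G5B5 data is handed to glTexImage2D
depends on the pixel-format extensions available.

#include <GL/gl.h>

static int next_pow2(int n)
{
    int p = 1;
    while(p < n)
        p <<= 1;
    return p;
}

/* Example: a 960x960 image padded into a 1024x1024 texture. */
void upload_padded(const GLubyte *pixels, int w, int h)
{
    int tw = next_pow2(w);           /* 1024 for w = 960 */
    int th = next_pow2(h);           /* 1024 for h = 960 */
    float max_u = (float)w / tw;     /* 0.9375           */
    float max_v = (float)h / th;

    /* Allocate the padded texture (contents undefined), asking for a
       16-bit internal format to keep the memory cost down... */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, tw, th, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    /* ...then upload the real image into the lower-left corner. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* When drawing, use (0,0)-(max_u,max_v) instead of (0,0)-(1,1). */
    (void)max_u;
    (void)max_v;
}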

David Olofson wrote:

[…]

Martijn Melenhorst (Prive) wrote:

Just for the people who did not know, as it was not mentioned in this
thread before, it seems that, regardless of which size you are going
to use, you should stick to a ‘power of 2’ square size.

it doesn’t have to be square, just both sides are powers of 2.
-==-
Jon Atkins
http://jonatkins.org/

Two things about this:

Some OpenGL video cards (I can’t remember which, but at one point it
caused us problems) have a limitation that textures can have an aspect
ratio of no worse than 1:8. In other words, if one dimension is 2048,
the other one has to be no smaller than 256.

The other thing is that it’s not that hard to ask which video card
you’re using - and if the card can’t handle big textures, load a
smaller version (either decimate it on the fly or have two versions
on disk). Don’t forget that a 2048x2048x24 texture is 12MB all by
itself. If you have more than one of them you’re really going to be
pushing a video card.
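
To make that query concrete, a minimal sketch (glGetIntegerv and the
GL_PROXY_TEXTURE_2D check are standard OpenGL; the fallback policy is just
an example):

#include <GL/gl.h>

/* Returns the atlas size to use: 'wanted' if the driver accepts it,
   otherwise a smaller safe size. */
int choose_atlas_size(int wanted)
{
    GLint max_size = 0;
    GLint got_width = 0;

    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);

    /* The reported maximum is not a promise for every format, so
       double-check with a proxy texture of the intended format. */
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA, wanted, wanted, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got_width);

    if(wanted <= max_size && got_width == wanted)
        return wanted;                       /* e.g. 2048 on newer cards */

    return max_size < 256 ? max_size : 256;  /* fall back to 256x256     */
}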

    Kent

Kent Quirk, CTO, CogniToy
@Kent_Quirk
http://www.cognitoy.com