Bits per pixel in OpenGL

I apologize if this seems like an obvious question.

If I want to use OpenGL in SDL, the maximum bits per pixel I can use seems to
be 16. Is this an SDL limitation or an OpenGL limitation ?

Thanks.

Ciao,
--
Valerio Luccio - NYU, Center for Neural Science
Phone: +1 212 9983936 Fax: +1 212 9954011
URL : http://cns.nyu.edu/~valerio

    "In an open world, who needs windows or gates?"

AFAIK, the limit is 32 bits, but what you can actually get depends on the
hardware. Some cards can't accelerate anything but 16-bit modes, and the
drivers may simply refuse to set anything else up, as that would require
software rasterization to work.

BTW, 32 bits (actually 24 bits aligned to 32, I guess) works fine on my
Matrox G400 using SDL, Utah-GLX and XFree86 on GNU/Linux.
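
For reference, here is a minimal SDL 1.2 sketch of how an explicit GL pixel
format is usually requested. The channel sizes below are only example values,
and as noted above, what you actually get still depends on the hardware and
driver.

```c
#include <stdio.h>
#include "SDL.h"

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    /* These are requests, not guarantees; the driver picks the closest
       matching visual, or the call fails if nothing fits. */
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,     8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE,   8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,    8);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE,  24);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    if (SDL_SetVideoMode(640, 480, 32, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* ... render with OpenGL, SDL_GL_SwapBuffers(), etc. ... */

    SDL_Quit();
    return 0;
}
```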

//David

On Tuesday 20 February 2001 23:54, Valerio Luccio wrote:

I apologize if this seems like an obvious question.

If I want to use OpenGL in SDL, the maximum bits per pixel I can use seems
to be 16. Is this an SDL limitation or an OpenGL limitation ?

David Olofson wrote:

On Tuesday 20 February 2001 23:54, Valerio Luccio wrote:

I apologize if this seems like an obvious question.

If I want to use OpenGL in SDL, the maximum bits per pixel I can use seems
to be 16. Is this an SDL limitation or an OpenGL limitation ?

AFAIK, the limit is 32 bits, but what you can actually get depends on the
hardware. Some cards can't accelerate anything but 16-bit modes, and the
drivers may simply refuse to set anything else up, as that would require
software rasterization to work.

BTW, 32 bits (actually 24 bits aligned to 32, I guess) works fine on my
Matrox G400 using SDL, Utah-GLX and XFree86 on GNU/Linux.

Sorry, but that’s not the case. I’m running on an SGI Octane with an
ESI graphics board.

When I list the available GL visuals I see, among others, a visual with 8 bpp
for each of RGBA and a depth of 24, and another with 12 bpp for each of RGB
and a depth of 24.
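
A listing like that can be reproduced with plain Xlib/GLX calls. The sketch
below is purely illustrative (it queries only a few attributes per visual) and
is not necessarily how the listing above was produced.

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <GL/glx.h>

/* Print the GL-capable visuals on the default screen with their channel
   and depth buffer sizes. */
int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XVisualInfo tmpl, *vis;
    int i, nvis;

    if (dpy == NULL)
        return 1;

    tmpl.screen = DefaultScreen(dpy);
    vis = XGetVisualInfo(dpy, VisualScreenMask, &tmpl, &nvis);

    for (i = 0; vis != NULL && i < nvis; i++) {
        int use_gl, r, g, b, a, depth;

        glXGetConfig(dpy, &vis[i], GLX_USE_GL, &use_gl);
        if (!use_gl)
            continue;
        glXGetConfig(dpy, &vis[i], GLX_RED_SIZE,   &r);
        glXGetConfig(dpy, &vis[i], GLX_GREEN_SIZE, &g);
        glXGetConfig(dpy, &vis[i], GLX_BLUE_SIZE,  &b);
        glXGetConfig(dpy, &vis[i], GLX_ALPHA_SIZE, &a);
        glXGetConfig(dpy, &vis[i], GLX_DEPTH_SIZE, &depth);
        printf("visual 0x%lx: R%d G%d B%d A%d, depth buffer %d\n",
               vis[i].visualid, r, g, b, a, depth);
    }

    if (vis != NULL)
        XFree(vis);
    XCloseDisplay(dpy);
    return 0;
}
```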

There must be something else that SDL sets that the SGI getvisual doesn't
like. I'll dig a little further into it and let you know. (Maybe this is one
of those things that keeps SDL still in beta on SGI.)

Thanks,


Valerio Luccio - NYU, Center for Neural Science
Phone: +1 212 9983936 Fax: +1 212 9954011
URL : http://cns.nyu.edu/~valerio

    "In an open world, who needs windows or gates?"

OK, I take it all back. It turns out that I was trying either 16 or 32 bits of
depth, but not 24. So, if I set the RGB to 12 bits each and the depth to 24,
it works on my SGI.

Now for another stupid question. If I set the SDL_Surface to 32 bpp, but I have
the GL visual set at 12/12/12 bpp (therefore 36 bits), how is it handled ? Am I
missing something ?
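
I can't say what any particular driver will do with that mismatch, but one way
to see how it is resolved is to compare what the returned surface reports with
what the GL attributes actually delivered. A hedged fragment, assuming SDL has
already been initialized as usual:

```c
/* Fragment: assumes SDL_Init(SDL_INIT_VIDEO) has already succeeded. */
SDL_Surface *screen;
int r, g, b, depth;

SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   12);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 12);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  12);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

/* The 32 here is the surface bpp argument; for GL modes the attributes
   above appear to be what drive the visual selection. */
screen = SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);
if (screen != NULL) {
    SDL_GL_GetAttribute(SDL_GL_RED_SIZE,   &r);
    SDL_GL_GetAttribute(SDL_GL_GREEN_SIZE, &g);
    SDL_GL_GetAttribute(SDL_GL_BLUE_SIZE,  &b);
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth);
    printf("surface reports %d bpp; GL visual has R%d G%d B%d, depth %d\n",
           screen->format->BitsPerPixel, r, g, b, depth);
}
```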

Thanks.

Ciao,

PS: I'm more than happy to help test out SDL on SGI Irix. We plan to use it for
an application that needs to run on Irix, Linux, Windows and MacOS.

On Tuesday 20 February 2001 23:54, Valerio Luccio wrote:

I apologize if this seems like an obvious question.

If I want to use OpenGL in SDL, the maximum bits per pixel I can use seems
to be 16. Is this an SDL limitation or an OpenGL limitation ?


Valerio Luccio - NYU, Center for Neural Science
Phone: +1 212 9983936 Fax: +1 212 9954011
URL : http://cns.nyu.edu/~valerio

    "In an open world, who needs windows or gates?"

David Olofson wrote:

I apologize if this seems like an obvious question.

If I want to use OpenGL in SDL, the maximum bits per pixel I can use
seems to be 16. Is this an SDL limitation or an OpenGL limitation ?

AFAIK, the limit is 32 bits, but what you can actually get depends on the
hardware. Some cards can't accelerate anything but 16-bit modes, and the
drivers may simply refuse to set anything else up, as that would require
software rasterization to work.

BTW, 32 bits (actually 24 bits aligned to 32, I guess) works fine on my
Matrox G400 using SDL, Utah-GLX and XFree86 on GNU/Linux.

Sorry, but that’s not the case. I’m running on an SGI Octane with an
ESI graphics board.

When I list the available GL visuals I see, among others, a visual with 8 bpp
for each of RGBA and a depth of 24, and another with 12 bpp for each of RGB
and a depth of 24.

How is that possible…?

RGBA8 = 8:8:8:8 = 32 bits.

RGB12 = 12:12:12 = 36 bits.

There is something weird going on here - and I'd guess it confuses SDL at
least as much as it confuses me! :-)
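
One way to settle what a context really provides, independent of what the
visual list claims, is to ask OpenGL itself once the context is current. A
generic GL 1.x sketch, not specific to SDL or IRIX:

```c
#include <stdio.h>
#include <GL/gl.h>

/* Call with a GL context current (e.g. after SDL_SetVideoMode(..., SDL_OPENGL)). */
static void print_framebuffer_bits(void)
{
    GLint r, g, b, a, depth;

    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_ALPHA_BITS, &a);
    glGetIntegerv(GL_DEPTH_BITS, &depth);

    printf("R:%d G:%d B:%d A:%d, depth buffer: %d bits\n",
           (int)r, (int)g, (int)b, (int)a, (int)depth);
}
```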

//David

On Wednesday 21 February 2001 18:22, Valerio Luccio wrote:


David Olofson wrote:

How is that possible…?

RGBA8 = 8:8:8:8 = 32 bits.

RGB12 = 12:12:12 = 36 bits.

There is something weird going on here - and I'd guess it confuses SDL at
least as much as it confuses me! :-)

That’s correct. High-end graphics cards, like the ones in the SGIs, allow 12
bits per gun. In the PC world they are rare (but I do believe they exist). The
question is: what does SDL do with it ? I haven’t had a chance to dig into the
source code yet, but I would like to know if anyone knows the answer in advance.

In the applications we use, we need, at a minimum, 10 bits per gun (right now we
are using video attenuators, but we would like to get away from them).

Thanks.

Ciao,
--
Valerio Luccio - NYU, Center for Neural Science
Phone: +1 212 9983936 Fax: +1 212 9954011
URL : http://cns.nyu.edu/~valerio

    "In an open world, who needs windows or gates?"

That’s correct. High-end graphics cards, like the ones in the SGIs, allow 12
bits per gun. In the PC world they are rare (but I do believe they exist). The
question is: what does SDL do with it ? I haven’t had a chance to dig into the
source code yet, but I would like to know if anyone knows the answer in advance.

I believe SDL will allow this, since it’s using GLX to choose the visual.
SDL imposes no constraints on the GL visual since it’s not being used for 2D.

See ya,
-Sam Lantinga, Lead Programmer, Loki Entertainment Software
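
To illustrate the GLX route Sam describes, here is a hedged sketch of asking
glXChooseVisual directly for at least 10 bits per gun; whether such a visual
exists is entirely up to the server and hardware.

```c
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    /* Minimum requirements: an RGBA visual with >= 10 bits per colour channel. */
    int attribs[] = {
        GLX_RGBA,
        GLX_RED_SIZE,   10,
        GLX_GREEN_SIZE, 10,
        GLX_BLUE_SIZE,  10,
        GLX_DEPTH_SIZE, 24,
        None
    };
    Display *dpy = XOpenDisplay(NULL);
    XVisualInfo *vi;

    if (dpy == NULL) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi == NULL) {
        fprintf(stderr, "no visual with at least 10 bits per gun\n");
    } else {
        printf("found visual 0x%lx, X depth %d\n", vi->visualid, vi->depth);
        XFree(vi);
    }

    XCloseDisplay(dpy);
    return 0;
}
```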

David Olofson wrote:

How is that possible…?

RGBA8 = 8:8:8:8 = 32 bits.

RGB12 = 12:12:12 = 36 bits.

There is something weird going on here - and I'd guess it confuses SDL at
least as much as it confuses me! :-)

That’s correct. High-end graphics cards, like the ones in the SGIs, allow
12 bits per gun. In the PC world they are rare (but I do believe they
exist). The question is: what does SDL do with it ? I haven’t had a chance
to dig into the source code yet, but I would like to know if anyone knows
the answer in advance.

There's nothing weird about cards with high-resolution RAMDACs (except the
price tags on them… ;-) - but there is something weird about the channel
vs. total pixel sizes.

How are you going to fit 12-bit RGB into a 24-bit pixel? The only thing I can
think of is some kind of RG/BG alternating scheme or something like that, but
I wouldn't expect to see that on a high-end card… (It's used for things
like webcams to preserve bandwidth and improve CCD chip resolution/cost at
the expense of color resolution.)

In the applications we use, we need, at a minimum, 10 bits per gun (right
now we are using video attenuators, but we would like to get away from
them).

RGB 10:10:10 is a “standard” 30(32) bit pixel format AFAIK, but I wouldn’t
bet on your card using anything like that. Does the driver actually claim
that it’s using normal, packed pixel formats?
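
For illustration, a 30-bit pixel padded to 32 bits is typically laid out as
2:10:10:10 - two spare bits plus three 10-bit channels. A tiny sketch of that
packing; the channel order is an assumption, not something any particular card
or driver guarantees:

```c
#include <stdint.h>

/* Pack 10-bit R, G and B values into one 32-bit word as x:R:G:B (2:10:10:10).
   Purely illustrative; real hardware may order the channels differently. */
static uint32_t pack_rgb10(uint32_t r, uint32_t g, uint32_t b)
{
    return ((r & 0x3FFu) << 20) | ((g & 0x3FFu) << 10) | (b & 0x3FFu);
}
```

The two leftover bits are either unused or carry a tiny alpha channel, which
is why such modes are usually advertised as 32 bpp.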

//David

On Thursday 22 February 2001 00:29, Valerio Luccio wrote:

David Olofson wrote:

There's nothing weird about cards with high-resolution RAMDACs (except the
price tags on them… ;-) - but there is something weird about the channel
vs. total pixel sizes.

How are you going to fit 12-bit RGB into a 24-bit pixel? The only thing I can
think of is some kind of RG/BG alternating scheme or something like that, but
I wouldn't expect to see that on a high-end card… (It's used for things
like webcams to preserve bandwidth and improve CCD chip resolution/cost at
the expense of color resolution.)

Why do you have to fit them into 24 bits/pixel ? The way I understand it, these cards
don’t use 24 bits/pixel.

In the applications we use, we need, at a minimum, 10 bits per gun (right
now we are using video attenuators, but we would like to get away from
them).

RGB 10:10:10 is a “standard” 30(32) bit pixel format AFAIK, but I wouldn’t
bet on your card using anything like that. Does the driver actually claim
that it’s using normal, packed pixel formats?

What we use now are video attenuators which, in hardware, modify the RGB output to
give us more steps on the green and blue channels (I’m not sure how they work
exactly). We are trying to go to more “standard” solutions and we want to make sure
that SDL will either deliver or at least not interfere with what we are doing.

Sam Lantinga wrote:

That’s correct. High-end graphics cards, like the ones in the SGIs, allow 12
bits per gun. In the PC world they are rare (but I do believe they exist). The
question is: what does SDL do with it ? I haven’t had a chance to dig into the
source code yet, but I would like to know if anyone knows the answer in advance.

I believe SDL will allow this, since it’s using GLX to choose the visual.
SDL imposes no constraints on the GL visual since it’s not being used for 2D.

So, if I understand correctly, if you choose a GL visual, SDL lets GL deal with the
image buffer ? Then does it make a difference what you choose for bpp when using
SDL_SetVideoMode ?

Maybe I’m missing something here.

Thanks.

Ciao,
--
Valerio Luccio - NYU, Center for Neural Science
Phone: +1 212 9983936 Fax: +1 212 9954011
URL : http://cns.nyu.edu/~valerio

    "In an open world, who needs windows or gates?"

David Olofson wrote:

There's nothing weird about cards with high-resolution RAMDACs
(except the price tags on them… ;-) - but there is something
weird about the channel vs. total pixel sizes.

How are you going to fit 12-bit RGB into a 24-bit pixel? The only
thing I can think of is some kind of RG/BG alternating scheme or
something like that, but I wouldn't expect to see that on a high-end
card… (It's used for things like webcams to preserve
bandwidth and improve CCD chip resolution/cost at the expense of
color resolution.)

Why do you have to fit them into 24 bits/pixel ? The way I
understand it, these cards don’t use 24 bits/pixel.

That’s what I thought, but the figures from your X server indicate
something else - that’s the reason why I’m confused.

In the applications we use, we need, at a minimum, 10 bits per
gun (right now we are using video attenuators, but we would
like to get away from them).

RGB 10:10:10 is a “standard” 30(32) bit pixel format AFAIK, but I
wouldn’t bet on your card using anything like that. Does the
driver actually claim that it’s using normal, packed pixel
formats?

What we use now are video attenuators which, in hardware, modify
the RGB output to give us more steps on the green and blue channels
(I’m not sure how they work exactly). We are trying to go to more
"standard" solutions and we want to make sure that SDL will either
deliver or at least not interfere with what we are doing.

Ok, I would think so too, as it’s actually the OpenGL driver that
does the job…

//David
