Pixel size oddity between machines/OSes

I’m in the middle of porting a very large legacy code base from MS Windows to Linux.
After looking at the alternatives, I chose SDL as my cross-platform window manager.
I have SDL 2.0.18 installed, though lately, I’ve been building SDL2 from source directly within my CMake-generated solution.

I have the SDL2 port mostly running under MS Windows, but when I ran the same code under Fedora 35, the OpenGL context failed to be created.
It turned out to be the SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8) call.
Removing it (leaving the attribute at its default value of 0, as far as I know) got the OpenGL context created.
Now I’m on to other bugs.
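
For reference, the relevant setup looks roughly like this (a simplified sketch, not the actual code from the port; the window title/size and the fallback logic are just illustrative, and it assumes SDL_Init(SDL_INIT_VIDEO) has already succeeded):

    /* Request an alpha channel for the default framebuffer. */
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);

    SDL_Window *win = SDL_CreateWindow("app", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 1280, 720,
                                       SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = SDL_GL_CreateContext(win);
    if (!ctx) {
        /* What ended up working on the Linux box: fall back to the default
         * of 0 alpha bits. The pixel format is chosen when the window is
         * created, so the window is recreated as well. */
        SDL_DestroyWindow(win);
        SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 0);
        win = SDL_CreateWindow("app", SDL_WINDOWPOS_CENTERED,
                               SDL_WINDOWPOS_CENTERED, 1280, 720,
                               SDL_WINDOW_OPENGL);
        ctx = SDL_GL_CreateContext(win);
    }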

But how does this make sense?
My old MS Windows laptop has an nVidia Quadro K3000M, and driver version 382.16, supporting OpenGL 4.5.
My Linux PC has an nVidia GeForce RTX 2060 Rev. A, and driver version 495.46, supporting OpenGL 4.6.
Seems odd that my old MS Windows laptop supports something my much newer Linux PC doesn’t.

The nVidia control panel under Linux reports the desktop color depth as 24 and shows available framebuffer configurations that support 32 bits, but there's no apparent way to change it.

Does anyone know what the story is here?

It seems to be because your display color depth is 24-bit instead of 32. Wherever you change the screen resolution should also let you change the color depth.

FYI, setting the framebuffer’s alpha size to 0 doesn’t mean you can’t do alpha blending, it just means you can’t have a transparent destination framebuffer.
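
For example (just an illustration with standard GL calls, not from any particular codebase):

    /* Ordinary alpha blending only reads the *source* alpha, so it works
     * even when the default framebuffer was created with 0 alpha bits: */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* What you lose without destination alpha bits are blend factors that
     * read the framebuffer's own alpha, e.g.: */
    glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);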

My Linux PC has an nVidia GeForce RTX 2060 Rev. A, and driver version 495.46

Are you using X11 or Wayland on Linux?
At least with X11, 8-bit alpha should work - dhewm3 uses it, and for some weird reason on nvidia/X11 it actually needs it to look correct (while on Wayland with other drivers it makes the window half-transparent).
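
Either way, it can be worth checking what the driver actually gave you after creating the context, something like this (untested sketch; the requested attribute is only a minimum):

    int alpha_bits = 0;
    if (SDL_GL_GetAttribute(SDL_GL_ALPHA_SIZE, &alpha_bits) == 0) {
        SDL_Log("default framebuffer alpha bits: %d", alpha_bits);
    }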

Wherever you change the screen resolution should also let you change the color depth.

That’s the problem – I can’t find where to change the color depth. system-config-display apparently could do that, but it’s obsolete. If mate-display-properties allows it, I can’t find it. Apparently I can do it the hard way with Xorg -configure. (And yes, I know this isn’t an SDL question. :) )

FYI, setting the framebuffer’s alpha size to 0 doesn’t mean you can’t do alpha blending, it just means you can’t have a transparent destination framebuffer.

That’s good news – I feel more comfortable blowing it off, then.

Are you using X11 or Wayland on Linux?

X11.

Well, that’s the theory - for some reason in dhewm3 alpha blending is broken (in some places at least, like the main menu) when not setting alpha bits…
I have no idea why; I guess I should investigate that eventually (though I don’t even know where to start…). If anyone is interested, setting r_waylandcompat 1 and then doing vid_restart on X11 reproduces the problem. It’s not nvidia-specific either - I just reproduced it on an RPi4.

So maybe alphabits 0 will work great for you, but there’s a possibility that it won’t, in case your application happens to do the same (presumably obscure) thing the dhewm3/Doom3 main menu does, whatever it is.

UPDATE: Ha, in dhewm3 it even happens on Windows (with nvidia driver). Still might not be relevant for you, of course:

Yet the rest of the UI in that screenshot is correct and using alpha blending. Weird.

Ok, I figured it out (see "r_waylandcompat should be deprecated in favor of EGL_EXT_present_opaque", issue #426 in the dhewm/dhewm3 GitHub repo, and the following comments for details).

Basically, the problem is that glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA); is used for the draw calls that look wrong (and other blend modes that don’t involve the destination alpha are used for the rest, which is blended correctly).
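
As far as I know, when the default framebuffer has no alpha bits the GL spec has destination alpha read back as 1.0, so GL_DST_ALPHA/GL_ONE_MINUS_DST_ALPHA degenerate into plain replacement. A hypothetical guard (not dhewm3’s actual fix, just to illustrate the dependency) would look something like:

    int alpha_bits = 0;
    SDL_GL_GetAttribute(SDL_GL_ALPHA_SIZE, &alpha_bits);

    if (alpha_bits > 0) {
        /* destination alpha really exists, so this blend mode can work */
        glBlendFunc(GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA);
    } else {
        /* no destination alpha: GL_DST_ALPHA reads as 1.0, so fall back
         * to something that only depends on source alpha (not equivalent,
         * purely for illustration) */
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    }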

So if you render directly to the context’s default framebuffer (instead of an FBO), you may want to grep for "GL_[A-Z_]*DST" to see whether your code is affected.
Not 100% sure whether only GL_(ONE_MINUS_)DST_ALPHA is affected or whether the GL_(ONE_MINUS_)DST_COLOR variants are too - on the one hand, those also use the destination alpha in the calculation; on the other hand, only as the factor for the result’s alpha channel, which is discarded anyway (if the context has no alpha channel).