I asked this on the Odin forum but still haven’t found a solution (see Enabling floating point z-buffers in SDL3/OpenGL - Help - Odin - Forum); thank you to Barinzaya for trying, though.
I’m currently trying to expand the OpenGL example code into something a bit more like a real game, learning OpenGL and Odin along the way. I thought I’d switch to a floating-point z-buffer, since that seems much easier to work with than an integer one, but I have gotten lost in all the incantations that don’t work. So far I get a 16-bit integer z-buffer when rendering to the back buffer, and a 24-bit integer z-buffer attached to a framebuffer object, and it stays that way no matter what flags I pass around.
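For the framebuffer-object case, my understanding is that the depth format comes from the attachment itself rather than from SDL, so the kind of thing I’m after looks roughly like this (a sketch using Odin’s vendor:OpenGL bindings; WIDTH and HEIGHT are placeholders, and I may be misusing something here):

```odin
// Sketch: attach a 32-bit float depth texture to a framebuffer object.
// WIDTH/HEIGHT are placeholder constants, fbo is assumed already bound.
depth_tex: u32
gl.GenTextures(1, &depth_tex)
gl.BindTexture(gl.TEXTURE_2D, depth_tex)
gl.TexImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT32F, WIDTH, HEIGHT, 0,
              gl.DEPTH_COMPONENT, gl.FLOAT, nil)
gl.FramebufferTexture2D(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.TEXTURE_2D, depth_tex, 0)
```

Even with something like this, querying the attachment still reports a 24-bit integer format for me.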
I have tried setting the global attributes GL_FLOATBUFFERS and GL_DEPTH_SIZE, as those seemed like they might be relevant, but they don’t appear to change from their defaults no matter when I set them.
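What I’ve been attempting is roughly the following (a sketch; the enum member names are my guess at Odin’s vendor:sdl3 bindings and may not match exactly):

```odin
import sdl "vendor:sdl3"

// Sketch: my understanding is these must be set *before* creating the
// window, otherwise the defaults stick. Member names may differ in the
// actual bindings.
sdl.GL_SetAttribute(.FLOATBUFFERS, 1)
sdl.GL_SetAttribute(.DEPTH_SIZE, 32)
window := sdl.CreateWindow("test", 800, 600, {.OPENGL})
```

Reading the attributes back with GL_GetAttribute after creating the context still shows the defaults for me.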
I’m on Windows 10, which might matter according to some 15-year-old Stack Overflow answers.
Code is here: Some buggy test code · GitHub
I have set it up to produce a clearly visible depth-precision error that a floating-point z-buffer should fix: the two triangles ought to intersect at the point where they change colour.
If you are not familiar with Odin, I don’t think the code should be too hard to follow. Running it should also be fairly simple: install the Odin compiler, add it to your path, put the source file in its own folder along with the SDL3 DLL, then compile and run it with odin run .