I’ve been working on a software rasterizer. Long story short: I do all my drawing into a raw pixel array (uint32_t*), upload that into an SDL_Texture, and render it to the window. It worked with SDL2, but now that I’ve ported it to SDL3 my program just shows a black window, and the window dimensions are different too. The photo is SDL3; the first comment on the post is a photo of the SDL2 version.
I disabled the UI I had on the left of the screen, but the wood cube should still appear on the screen. I believe the new size is the correct one, but I’d still appreciate knowing why it changed.
I’ve read the migration guide and tried to debug it myself, but I’m out of ideas. Here is the code diff (the relevant part is main.c, and maybe tasks.json if the problem happened during compilation).
One thing I noticed is that in the SDL2 version you’re creating the texture with SDL_TEXTUREACCESS_STREAMING, but in the SDL3 version you’re using SDL_TEXTUREACCESS_TARGET. TARGET is for render-to-texture via SDL_SetRenderTarget(), not for uploading from a CPU-side buffer.
Another thing: SDL3 functions return a bool to tell you whether they succeeded, yet you’re storing the result in an int and checking for -1, which is the old SDL2 convention. For SDL3 functions that return a bool, it’s true on success and false on failure.
Thanks for the help! Your suggestions make sense, but even after changing them, I still get the same result. You can see the new code on the link of the original post. Any idea why the size of the window would be different?
No idea why the window size would be different. I always create the window and renderer separately, but I don’t think using SDL_CreateWindowAndRenderer() is the problem. I don’t see WIDTH and HEIGHT in the diff, so it doesn’t look like they’ve been changed accidentally.
Also: this is all being done on the main thread, right?
Yes, this is all in the same thread, and w/h were not changed. I wonder if the sizing was saved to disk after a resize. Is that possible? Does SDL do that?
In test.c, your pixels array should be unsigned: use uint32_t instead of int32_t.
As to the pixel format, be aware that SDL kinda does the opposite of what a lot of people expect. SDL_PIXELFORMAT_RGBA8888 does not mean the channels are stored in memory in RGBA order. Rather, the “logical” order is RGBA (as in, if you treat the whole pixel as one uint32_t). On a little-endian CPU (aka 99% of CPUs, including x86 and ARM) the memory order for SDL_PIXELFORMAT_RGBA8888 will be ABGR, and only on big-endian machines (PowerPC and MIPS IIRC) will it actually be RGBA.
You can get around this by using SDL_PIXELFORMAT_RGBA32 which will map to a pixel format with an in-memory RGBA order (SDL_PIXELFORMAT_ABGR8888 on little-endian machines).
So instead of clearing with hard-coded hex values, use something like
const SDL_PixelFormatDetails *formatDetails = SDL_GetPixelFormatDetails(SDL_PIXELFORMAT_RGBA32);
uint32_t clearColor = SDL_MapRGBA(formatDetails, NULL, 0, 0, 0, 255); // SDL3's SDL_MapRGBA takes a palette argument; NULL for non-palettized formats
SDL_memset4(pixels, clearColor, WIDTH * HEIGHT); // we are also eliminating the for-loop
I don’t think so. And the whole thing with SDL_PIXELFORMAT_RGBA8888 being ABGR memory order on little-endian systems has been there as long as I can remember.