new gpu api and SDL_Render* in same renderer?

Hi! I’m digging into SDL3 now that the gpu api is merged in. I’m escaping Unity after several years of working with it. The gpu api, at first blush, seems pretty nice.

The first SDL example I got working was the basic hello example:
https://github.com/libsdl-org/SDL/blob/main/docs/hello.c

Then I got a triangle rendering by adapting this to the SDL_main functions:
https://github.com/TheSpydog/SDL_gpu_examples/blob/main/Examples/BasicTriangle.c

Not because I have a specific need right now, but because I can see some of the SDL_Render* functions being useful while prototyping, I was trying to get SDL_RenderDebugText working in the BasicTriangle gpu example. But if I put SDL_RenderPresent in that example, I get a Vulkan error: “explicit sync is used, but no acquire point is set”.

My google-fu is failing me, so this is either really easy and I just don’t understand SDL’s rendering stuff enough to piece it together yet, or it’s a pretty atypical use-case.

Is there a straightforward way to use the two APIs together on the same renderer, without resorting to e.g. rendering with the gpu api to a texture and then drawing that texture with SDL_RenderTexture or something like that?

Thanks!

You can use both together, but you have to be careful.

When you create the renderer you must explicitly choose the “gpu” backend.

If you’re using the renderer, before you do your own drawing with SDL_GPU, you need to call SDL_FlushRenderer().
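A minimal sketch of that setup (error handling omitted for brevity; in SDL3, SDL_CreateRenderer takes the driver name directly):

```c
/* Pick the "gpu" renderer backend explicitly at creation time. */
SDL_Window *window = SDL_CreateWindow("mixed rendering", 1280, 720, 0);
SDL_Renderer *renderer = SDL_CreateRenderer(window, "gpu");

/* ... later, each frame, before doing your own SDL_GPU drawing: */
SDL_FlushRenderer(renderer);  /* submits the renderer's queued 2D batch */
```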

At some point it becomes easier to just ditch SDL_Renderer and use SDL_GPU for even 2D stuff. Implementing sprite batching etc with quads is pretty straightforward.


Thanks for the response @sjr!

Do you know of an example of them working together? After I add SDL_SetHint(SDL_HINT_RENDER_DRIVER, "gpu"), the call to SDL_SetGPUSwapchainParameters crashes.

More generally, I can’t seem to get a gpu device and swapchain set up when using the gpu render driver.

Cheers

So, a couple of things:

  1. The renderer will create its own device, swapchain, etc. If you want to create the swapchain yourself (specific format, etc.), then there’s probably no way to also use the renderer and be sure it will all play nicely forever.
  2. I would probably specify the “gpu” backend explicitly when creating the renderer, rather than via a hint. If you’re using SDL 3.4 (which isn’t out yet, but should be Real Soon Now), you can also use SDL_CreateGPURenderer(), which has the advantage of letting you declare which shader formats you’re going to be supplying. (SDL 3.4’s 2D renderer lets you supply custom shaders for use within the renderer, and since this sets up the underlying SDL_GPU device to accept those formats, it also applies if you’re mixing the renderer and SDL_GPU.)
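For example (a sketch, assuming SDL 3.4’s SDL_CreateGPURenderer as described; SPIR-V is just one possible format choice), it hands back both the renderer and the underlying device in one call:

```c
/* Create a GPU-backed renderer and get the underlying SDL_GPUDevice,
 * declaring up front which shader formats we will supply. */
SDL_GPUDevice *device = NULL;
SDL_Renderer *renderer = SDL_CreateGPURenderer(window, SDL_GPU_SHADERFORMAT_SPIRV, &device);
if (!renderer) {
    SDL_Log("Couldn't create GPU renderer: %s", SDL_GetError());
}
```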

You can access the renderer’s underlying SDL_GPUDevice by doing:

SDL_PropertiesID renderProps = SDL_GetRendererProperties(myRenderer);
SDL_GPUDevice *renderDevice = SDL_GetPointerProperty(renderProps, SDL_PROP_RENDERER_GPU_DEVICE_POINTER, NULL);

(you should probably do this once after creating the renderer and then stash the GPU device somewhere, instead of asking for it every frame)

So, the general flow would be like

  1. Create the window
  2. Create the renderer
  3. Get GPU device from renderer
  4. Each frame:
    4.1) Call SDL_RenderClear(), so the renderer knows there’s a new frame
    4.2) Call SDL_FlushRenderer()
    4.3) Create your own command buffer, your own passes, draw your stuff with SDL_GPU, then submit your command buffer
    4.4) Draw whatever you’re gonna draw with SDL_Renderer
    4.5) Call SDL_RenderPresent()
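The per-frame steps above might look like this (a sketch; `device` comes from the renderer properties as shown earlier, and `my_offscreen_texture` is a hypothetical render target, since there doesn’t seem to be a way to get the renderer’s swapchain texture):

```c
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);                 /* 4.1: start a new frame */
SDL_FlushRenderer(renderer);               /* 4.2: submit queued 2D work */

/* 4.3: your own SDL_GPU work, targeting an offscreen texture */
SDL_GPUCommandBuffer *cmd = SDL_AcquireGPUCommandBuffer(device);
SDL_GPUColorTargetInfo target = { 0 };
target.texture = my_offscreen_texture;     /* hypothetical render target */
target.load_op = SDL_GPU_LOADOP_CLEAR;
target.store_op = SDL_GPU_STOREOP_STORE;
SDL_GPURenderPass *pass = SDL_BeginGPURenderPass(cmd, &target, 1, NULL);
/* ... bind your pipeline, draw your stuff ... */
SDL_EndGPURenderPass(pass);
SDL_SubmitGPUCommandBuffer(cmd);

/* 4.4: 2D drawing with SDL_Renderer */
SDL_RenderDebugText(renderer, 8.0f, 8.0f, "hello");

SDL_RenderPresent(renderer);               /* 4.5 */
```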

However, this all seems kinda fragile IMHO, and I personally wouldn’t want to be trying to use SDL_GPU within the confines of SDL_Renderer.

One of the developers of SDL_GPU posted how to create a 2D sprite batcher in SDL_GPU relatively easily (ignore the part at the top about SDL_Renderer not having custom shaders; as of SDL 3.4 it does support them)

edit: I don’t see a way to get the swapchain texture from SDL_Renderer to create your own render passes.


If anyone knows of an SDL_Renderer → gpu migration guide I’d look at it, but I don’t think I’d ever have a reason to use the gpu api for any of my codebases.

If you’re trying to mix SDL_Renderer and SDL_GPU (or any other 3D API) at some point it becomes easier to just ditch SDL_Renderer and do the 2D stuff yourself.

Anyway, unless I’m missing something, there doesn’t seem to be a way to mix SDL_Renderer and SDL_GPU. @cosmonaut @TheSpydog any thoughts / ideas?