Hello, I’ve been trying for days to get SDL_DrawGPUIndexedPrimitivesIndirect to work.
I’m making a basic 3D particle system, and I want each particle to render as a quad, all of them instanced in a single draw call. I’m pretty new to graphics, so I assumed this function is what I need, but I can’t find a precise explanation anywhere of how to use it or what each of the parameters means exactly.
Could someone provide a minimal example? Thanks a lot.
My understanding is that indirect drawing is where the GPU itself (in a compute shader, typically) fills a buffer with draw parameters, which the CPU then tells it to execute later.
What you’re looking for is instanced drawing (which you can do with either SDL_DrawGPUPrimitives() or SDL_DrawGPUIndexedPrimitives()). However, for a particle system where only quads are being drawn, instanced drawing is probably overkill. You can just fill a vertex buffer with the quads and draw them all in one draw call. Or use point primitives.
edit: instanced rendering would require you to maintain (and upload to the GPU) a buffer with a separate translation matrix for each particle. Definitely not optimal.
What I would do: if you don’t need the particles to appear rotated, use point primitives (set the primitive type to SDL_GPU_PRIMITIVETYPE_POINTLIST in your graphics pipeline create info struct) and then output an appropriate point size in your vertex shader. No need to fill a vertex buffer with quads, just the center point of each particle and whatever other information you need for the shader.
If you do need them to appear rotated then just fill a vertex buffer with quads (you could do this in a compute shader if speed became a problem).
edit 2: I should point out (pun very much intended) that point primitives don’t appear as points on the screen. Your vertex shader does the same math on the point that it would do on a vertex in a model, but in addition to the position on screen it also outputs the size of the point (which can vary from point to point). The GPU then renders a quad in its place, centered on the point, using a fragment shader like it would for “regular” geometry (including texture coordinates).
The downsides to point primitives are:
- Clipping is done based on the point primitive’s position on screen, not the quad’s. For points rendered at very small sizes this doesn’t really matter. But consider a point with a large point size moving off screen: the quad that gets rendered will disappear when it’s only halfway off screen, because it was centered on the point, which got clipped, so no further processing was done.
- The quad that gets rendered will always be square with the screen’s X and Y axes. If your particle system rotates the particles, you’ll have to get creative. Maybe pass the particle’s rotation to the fragment shader and have it rotate the texture coordinates before sampling the texture.