Question: Blur filters on SDL_Textures?

The situation is that I have a raycast shadow map for a 2D tile-based environment, but it casts hard-edged shadows. I want to run a simple blur filter on this texture before rendering it, to soften the edges and make the effect more visually appealing.

So far I’ve come up with two plausible methods, and neither really works well.
The first method I’ve tested is to scale the texture down to a quarter of its size or less, then scale it back up with linear filtering enabled (SDL_HINT_RENDER_SCALE_QUALITY). While this works, the blur isn’t quite strong enough for what I want to achieve, and it can’t really be improved, because scaling the texture down any further makes it extremely blocky when scaled back up. I also tried scaling it down and then adding “jitter” by additively rendering it four times, offset by a pixel each time, at 25% opacity. This improved the result, but again can’t really be taken any further. It can also create quite ugly aliasing artifacts when there are small details on the texture, and it eats performance at higher resolutions…
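
For concreteness, a minimal sketch of this downscale/upscale-with-jitter approach, assuming SDL2 with render-target support; `shadow_tex` and `soft_shadow_pass` are illustrative names, not from the thread:

```c
#include <SDL.h>

/* Sketch: blur by downscaling to quarter size, then upscaling with
   linear filtering plus a four-pass additive "jitter". Real code would
   cache the intermediate texture instead of recreating it each frame. */
void soft_shadow_pass(SDL_Renderer *ren, SDL_Texture *shadow_tex, int w, int h)
{
    /* Linear filtering applies to textures created after this hint is set. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");

    SDL_Texture *small = SDL_CreateTexture(ren, SDL_PIXELFORMAT_RGBA8888,
                                           SDL_TEXTUREACCESS_TARGET,
                                           w / 4, h / 4);

    /* Downscale: draw the shadow map into the quarter-size target. */
    SDL_SetRenderTarget(ren, small);
    SDL_RenderCopy(ren, shadow_tex, NULL, NULL);
    SDL_SetRenderTarget(ren, NULL);

    /* Upscale with jitter: four one-pixel-offset passes at ~25% opacity. */
    SDL_SetTextureBlendMode(small, SDL_BLENDMODE_ADD);
    SDL_SetTextureAlphaMod(small, 64);
    const SDL_Point offsets[4] = { {0, 0}, {1, 0}, {0, 1}, {1, 1} };
    for (int i = 0; i < 4; ++i) {
        SDL_Rect dst = { offsets[i].x, offsets[i].y, w, h };
        SDL_RenderCopy(ren, small, NULL, &dst);
    }
    SDL_DestroyTexture(small);
}
```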

Method two would be to render the shadow map in software, giving me full access to write a blur filter manually. But this requires sending data across the bus every frame, which is hardly optimal; it also seems like a backwards move in today’s world of stupendously powerful GPUs everywhere.
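
In software, “a blur filter” plus the per-frame upload amounts to something like this sketch (a naive single-pass horizontal box blur; the 32-bit pixel format and all names are assumptions):

```c
#include <SDL.h>

/* Naive horizontal box blur over 32-bit pixels. Run a vertical pass
   too, or repeat it, for a stronger blur. */
static void box_blur_h(Uint32 *dst, const Uint32 *src, int w, int h)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int x0 = (x > 0) ? x - 1 : x;
            int x1 = (x < w - 1) ? x + 1 : x;
            /* Average each byte lane of the three horizontal neighbors. */
            Uint32 p = 0;
            for (int shift = 0; shift < 32; shift += 8) {
                Uint32 sum = ((src[y * w + x0] >> shift) & 0xFF)
                           + ((src[y * w + x ] >> shift) & 0xFF)
                           + ((src[y * w + x1] >> shift) & 0xFF);
                p |= (sum / 3) << shift;
            }
            dst[y * w + x] = p;
        }
    }
}

/* The per-frame upload across the bus that the paragraph above worries
   about: one SDL_UpdateTexture call per frame. */
void upload_shadow(SDL_Texture *tex, const Uint32 *pixels, int w)
{
    SDL_UpdateTexture(tex, NULL, pixels, w * (int)sizeof(Uint32));
}
```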

Suggestions? Ideas? Is there a way to blur an SDL texture in hardware? Is there some way to run a GLSL shader on an SDL_Texture? Are there any other techniques that work at run time?
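
(On that last question: SDL2 does expose SDL_GL_BindTexture(), which binds an SDL_Texture in the renderer’s GL context, so you can in principle draw it through your own GLSL program. It fights the renderer’s internal state, though, so treat the following as a fragile sketch; `blur_program` and `draw_with_shader` are hypothetical.)

```c
#include <SDL.h>
#include <SDL_opengl.h>

/* glUseProgram is GL 2.0+, so resolve it through SDL rather than
   relying on the platform's GL headers. */
typedef void (APIENTRY *USEPROGRAMPROC)(GLuint program);

void draw_with_shader(SDL_Texture *tex, GLuint blur_program)
{
    USEPROGRAMPROC useProgram =
        (USEPROGRAMPROC)SDL_GL_GetProcAddress("glUseProgram");

    float texw, texh;
    SDL_GL_BindTexture(tex, &texw, &texh); /* bind in the renderer's GL context */
    useProgram(blur_program);              /* hypothetical GLSL blur program */
    /* ... draw a quad covering the shadow map here ... */
    useProgram(0);
    SDL_GL_UnbindTexture(tex);
}
```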

2014-12-31 18:02 GMT-03:00, BadManiac <martin_maniac at hotmail.com>:

Method two would be to render the shadow map in software, giving me full access to write a blur filter manually. But this requires sending data across the bus every frame, which is hardly optimal; it also seems like a backwards move in today’s world of stupendously powerful GPUs everywhere.

Sending a single framebuffer once per frame isn’t as bad as it sounds,
especially when you consider the shadow map doesn’t have to have the
same resolution as the screen (you could impose a cap on its size, for
example, or even make it a setting, which can help for people with old
hardware).
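
A rough illustration of that cap (the 512 and all names here are arbitrary, not anything from the thread):

```c
/* Cap the shadow map's resolution; 512 is arbitrary and could just as
   well be a user-facing setting for older hardware. */
#define SHADOW_MAX 512

void shadow_map_size(int screen_w, int screen_h, int *shadow_w, int *shadow_h)
{
    *shadow_w = (screen_w < SHADOW_MAX) ? screen_w : SHADOW_MAX;
    *shadow_h = (screen_h < SHADOW_MAX) ? screen_h : SHADOW_MAX;
    /* Blur at this size in software, then stretch over the full screen. */
}
```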

That said, as you mention, this is something that would be better done as a GPU shader, and that’s beyond the renderer API’s capabilities. You would need to redo everything in pure OpenGL instead (maybe SDL_gpu can handle this, but even in that case you’ll need to rewrite the code for the same reason).

If you can’t do that, the best you can get is just faking it. (Also, for the record, this is an issue even on modern GPUs, because you’re essentially trying to fake antialiasing out of an image without antialiasing, so to get anything even remotely optimal you’ll need to apply some heuristics.)

PS: I’d rather have hard shadows than ugly-looking soft shadows, so if you have issues coming up with something good, take that into account…

Yeah, SDL_gpu can load shaders and send arbitrary uniform/attribute data to
them. Rewriting code to use the SDL_gpu renderer is not bad at all, except
that the documentation isn’t awesome yet. The shader-demo in the source
repository might be a good place to look and I’m always ready to answer
questions.

Jonny D
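
For anyone following along, loading a shader pair and sending uniform data with SDL_gpu looks roughly like the sketch below (the file names and the `u_texel` uniform are made up; the `gpu_*` names follow the conventions used in SDL_gpu’s demos):

```c
#include "SDL_gpu.h"

/* Rough sketch of the SDL_gpu shader workflow; error checking omitted. */
void setup_blur_shader(void)
{
    Uint32 v = GPU_LoadShader(GPU_VERTEX_SHADER, "common.vert"); /* hypothetical file */
    Uint32 f = GPU_LoadShader(GPU_FRAGMENT_SHADER, "blur.frag"); /* hypothetical file */
    Uint32 program = GPU_LinkShaders(v, f);

    /* Tell SDL_gpu which attribute/uniform names the shaders use. */
    GPU_ShaderBlock block = GPU_LoadShaderBlock(program,
        "gpu_Vertex", "gpu_TexCoord", "gpu_Color",
        "gpu_ModelViewProjectionMatrix");
    GPU_ActivateShaderProgram(program, &block);

    /* Arbitrary uniform data, e.g. the texel size for the blur kernel. */
    float texel[2] = { 1.0f / 512.0f, 1.0f / 512.0f };
    GPU_SetUniformfv(GPU_GetUniformLocation(program, "u_texel"), 2, 1, texel);
}
```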

2015-01-01 11:56 GMT-03:00, Jonathan Dearborn:

Rewriting code to use the SDL_gpu renderer is not bad at all, except
that the documentation isn’t awesome yet.

The problem is when there’s a lot of code that needs rewriting; nothing will help you there :P

Thanks for all your replies, it sounds pretty hopeful. I’ve contacted you about this, Jonny; hopefully we can get it figured out. It’s not that I have that much code to rewrite, I’m just gathering techniques BEFORE I start the major rewrite. This is all stuff I was able to do in my old 16bpp software library, but I’m trying to update my tools and knowledge by a century or so :)