[Shader] Smooth pixel filtering

Hi everyone !

I’ve just ported an SDL 1.2 game using software surfaces to SDL2, and now use SDL_Texture exclusively, which gives a huge speedup.
The last remaining point is that the game offers the option of a smoothing display filter (Scale2x and HQ4x). This was previously done with direct pixel manipulation, since that was not a costly operation with SDL_Surface.

Now that we use SDL_Texture, I’ve heard that the only way to reliably do pixel manipulation is to use a GLSL pixel shader.
I’m pretty new to modern display features, so I’ll ask some questions before digesting the documentation :slight_smile:

  • Is it possible to apply a GL pixel shader to the SDL_Renderer directly, since I just want to modify the final result?

  • If not, is there a way to apply it to each texture, or should I keep the option of using SDL_Surface for all the smoothing modes?

  • Last question, though it’s not related to SDL so I may research it myself: if applying a shader to the renderer is possible, will that also work on mobile platforms using GL ES?

Thanks to all readers, and I apologize for my bad English! :wink:

First of all: if you’re going to use GLSL, it will only work with OpenGL
SDL_Renderers, so no Direct3D. I’m unsure about OpenGL ES.
Secondly: just use an SDL_Surface if you’re going to do a lot of
manipulation on the CPU. Alternatively, use an SDL_Texture (with streaming
access, I think) and accept the small performance loss.
Thirdly: you can tell the GPU to scale the texture for you by giving
SDL_RenderCopy a destination rectangle that is bigger (or smaller) than the
original image. I believe you can give SDL a hint on how you want the
renderer to scale it, but that’s about it: a kind suggestion that it isn’t
obligated to follow.
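Something like this is what I mean by the streaming route (untested sketch; renderer, the sizes, and the filter step are placeholders):

    /* CPU-side filtering into a streaming texture, then GPU scaling via
       the destination rectangle. */
    SDL_Texture *tex = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING, 320, 240);

    void *pixels;
    int pitch;
    if (SDL_LockTexture(tex, NULL, &pixels, &pitch) == 0) {
        /* run your CPU filter into `pixels` here, respecting `pitch` */
        SDL_UnlockTexture(tex);
    }

    /* let the GPU stretch the result to the window */
    SDL_Rect dst = { 0, 0, 1280, 960 };
    SDL_RenderCopy(renderer, tex, NULL, &dst);
    SDL_RenderPresent(renderer);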


  1. OK, since the portability of the app is more important than performance, I’m fine with dropping Direct3D support (but not OpenGL ES, so I’ll investigate that).

  2. The only pixel manipulations are these smoothing filters, so if I can apply a shader directly to the renderer, I guess using static SDL_Textures would be the best choice.

  3. Telling the GPU to do bilinear interpolation will work for the Scale2x filter, since the result is pretty much the same, but not for HQ4x, which is a little more elaborate ( http://www.hiend3d.com/hq4x.html ). Unless it is possible to set a custom hint for scaling textures?

By the way, thanks for your answer :slight_smile:


If you only want bilinear interpolation, just use:

    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1");

… before creating your texture.

That texture will then be blitted with bilinear interpolation. With Direct3D the system can also do better with “2” (that’s anisotropic; is it the same as HQ4x?). Read the doc here anyway:

http://wiki.libsdl.org/SDL_HINT_RENDER_SCALE_QUALITY

AFAIR this setting is “per texture”.
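For example (sketch, assuming an existing renderer and surface; the hint is read at texture-creation time):

    /* "0" = nearest, "1" = linear, "2" = best (anisotropic, if available) */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1");
    SDL_Texture *smooth = SDL_CreateTextureFromSurface(renderer, surface);

    /* textures created after switching the hint back stay unfiltered */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "0");
    SDL_Texture *crisp = SDL_CreateTextureFromSurface(renderer, surface);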


Bye,
Gabry

I don’t want to use the native bilinear interpolation, because we only have the Scale2x and HQ4x filters for now, but we will surely add more later.

To elaborate a bit, what I really want to do is keep using SDL_Texture, write an abstraction class which compiles arbitrary shaders (not known at compile time), and “apply” one of these shaders to the renderer at each SDL_RenderPresent().

Is there some kind of hack (or a proper way) to do this?
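For the “not known at compile time” part, I mean compiling the GLSL source at runtime, something like this (rough sketch; error handling is minimal and the source string would be loaded from disk):

    /* Compile a fragment shader from a string loaded at runtime. */
    GLuint compile_fragment_shader(const char *source)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &source, NULL);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            SDL_Log("shader compile failed: %s", log);
            glDeleteShader(shader);
            return 0;
        }
        return shader;
    }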


I think you won’t get around using OpenGL directly. SDL2 doesn’t expose
GPU shaders (for obvious reasons).

Jonas

SDL2 doesn’t expose GPU shaders because they’re dependent on the renderer
(OpenGL uses GLSL, Direct3D uses HLSL). However, one could probably make
their own shader language for SDL which would then be converted to the
shader language relevant to the renderer, just like SDL_Texture.


That would require building a full-fledged compiler into SDL, with a defined grammar, a lexer, a parser, and multiple code generator backends. I’m not sure anything like that would be within the scope of SDL.


Which is exactly why SDL doesn’t have that.



Which is exactly why SDL doesn’t have that.

Yeah, this is definitely extension library territory.


The proper way to do it is to write an extension library for SDL2. No
one’s done it yet, because it’s the sort of thing that scares people
off, but it should be quite possible. I’d suggest looking at the
parser stage of TCC: I’ve thought before about what a nice library
that would make, and I always come to the conclusion that using the
TCC parser as a starting point would probably be the fastest route to
the goal. If you actually go this route, then I’d suggest that you
start by renaming the parser functions, so that someone else can
later use TCC itself for a software backend.

Just remember before you jump into such a project: compiler design is
not always the most straightforward of processes, so do the simple
stuff first.

No one has done it because at that point you’d be using either direct
calls to the video API or a premade full-blown engine from elsewhere.
SDL2 only provides the basics (and SDL1 didn’t even provide that).

Just do the “shaders” on the CPU.


Hm, OK, writing an abstraction class for a shading language is a little too much for what I want, since I’m fine with dropping Direct3D support and using only GLSL. I’m even fine with forcing the OpenGL ES 2 driver if it’s the only one supported on the five main platforms.
So let’s say I just want to write a class that compiles a GLSL shader and applies it to the renderer; the whole point is to know whether that’s possible before writing any code.

Quote:

Just do the “shaders” on the CPU.

That means not using SDL_Texture and reverting to SDL_Surface. This is the easiest way, but definitely not the fastest one (maybe it’s the only way for what I want?).

Anyway, thanks for your help! I really appreciate it :slight_smile:

If you need shaders, my suggestion would be to go for OpenGL or OpenGL ES
directly. If you are not familiar with the APIs, you will face a steeper
learning curve in the beginning, but you’ll gain flexibility and performance
in the long run. A bare-bones skeleton is below.

The trouble with shoving shaders into a shader-unaware abstraction is that it
will cause trouble sooner or later. SDL may be using shaders internally (it
has to, for OpenGL 3+ and OpenGL ES 2+), and substituting your own for these
is just asking for trouble.
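The skeleton itself is small (sketch; window size and title are arbitrary, and shader setup and drawing are left out):

    #include <SDL.h>
    #include <SDL_opengl.h>

    int main(int argc, char *argv[])
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("filters", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480,
                                           SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        int running = 1;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT)
                    running = 0;

            glClear(GL_COLOR_BUFFER_BIT);
            /* compile/bind your own shaders and draw the scene here */
            SDL_GL_SwapWindow(win);
        }

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }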


It is. My code does it. You just have to make sure that you preserve SDL’s internal state, which means you query the active shader program (and anything else you change in the GL state machine) and then restore it once you’re done.

Mason
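In C, that discipline looks something like this (sketch; myProgram stands in for a program you linked yourself):

    GLint previous = 0;
    glGetIntegerv(GL_CURRENT_PROGRAM, &previous);  /* save SDL's program */

    glUseProgram(myProgram);
    /* issue your own draw calls here */

    glUseProgram((GLuint)previous);                /* hand the state back */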


SDL probably shouldn’t get it either. I think we can mostly agree
that’s beyond SDL’s scope.

Given how SDL’s 2D renderer gets used, though, including a couple of
shaders for a few major mag filters in the renderer drivers that
support them wouldn’t go unappreciated. Bicubic and either hq4x,
4xBRZ, or both, are the obvious choices. Obviously, if you’re using an
OpenGL that doesn’t support shaders, you’re going to get your choice
of GL_NEAREST and GL_LINEAR. :slight_smile:

You should be able to do bicubic even on DirectX 8 class hardware,
even with pre-2.0 pixel shaders. That’d give you GeForce 4 era
hardware support, and anything shipped with Windows post-XP is going
to have better than that. Even Intel GMA will do the job, and that’ll
take you back about 10 years now.

Actually, I suspect Intel GMA and just about anything DirectX 9 class
should do hq4x and 4xBRZ just fine.

OpenGL ES 2 would likely require a slightly different shader than one
written for maximum compatibility on desktop OpenGL. Some OpenGL ES 1
APIs, I guess, expose shader functions because they’re actually
OpenGL ES 2 hardware, but I don’t see a major reason to bother with
ES 1: support would be spotty, you’d have ES 2 anyway, and anything
using ES 1 natively probably has a 240x320 or smaller screen. Not
much point in going out of your way. :slight_smile:

Joseph


It is. My code does it. You just have to make sure that you preserve SDL’s internal state, which means you query the active shader program (and anything else you change in the GL state machine) and then restore it once you’re done.

Just to be sure: by “My code does it”, do you mean your code uses a GLSL shader when rendering to the window, or that it applies a shader when copying SDL_Textures/GL textures?
Anyway, I’d be happy to see the piece of code if you agree; it could teach me a lot about how to use shaders, in addition to the official SDL example :wink:

@Joseph Carter
OK, so modern (or not) GPUs should be able to do some simple (or not :wink: ) pixel filtering using shaders, but my first question was mostly about the software side, basically where the limit of using GLSL shaders with SDL lies :slight_smile:

Thanks again.

There’s no limit to GLSL shaders with SDL if you know what you’re doing.


My code applies shaders while rendering both textures and entire render targets to the current window. :slight_smile:

You can find the entire codebase at http://code.google.com/p/turbu/source/ (it’s in Delphi). Run a search for “dm_shaders” for units that use the shader code. But the basic rule is: any code that does rendering outside of SDL’s system has to restore any OpenGL state it modifies before going back to SDL.

In the shader code, that means doing something like this:

    procedure RenderWithShaders;
    var
      currentProgram: GLint;
    begin
      // save whichever program SDL currently has bound
      glGetIntegerv(GL_CURRENT_PROGRAM, @currentProgram);
      try
        // do your own shader-based rendering here
      finally
        // restore SDL's program before handing control back
        glUseProgram(currentProgram);
      end;
    end;


Yeah, thanks for the code sample, it’s really helpful :slight_smile:

By the way, I’ve tried to work with shaders over the last few days, and I haven’t reached my goal of applying a shader when rendering.
It works fine when I copy a GL texture or an SDL_Texture to the renderer, but not if I try to apply the shader just before swapping the window with SDL_RenderPresent() (and restoring it afterwards).
It also doesn’t work if I try it with the official SDL shader example, where we just use the OpenGL API.

Can you confirm that it is really possible to apply a shader when rendering, or should I apply and restore the shader at each SDL/OpenGL draw to the renderer?
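A sketch of the render-to-target alternative discussed above, for comparison (hypothetical; myProgram is a program linked beforehand, the OpenGL renderer is assumed, and the texture-coordinate orientation may need flipping for render targets):

    /* Draw the scene into a target texture with the SDL renderer, then
       blit that texture to the window through a custom GLSL program,
       restoring SDL's GL state afterwards. */
    SDL_Texture *target = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                                            SDL_TEXTUREACCESS_TARGET, 320, 240);

    SDL_SetRenderTarget(renderer, target);
    /* ... SDL_RenderCopy the scene here ... */
    SDL_SetRenderTarget(renderer, NULL);

    GLint prev = 0;
    glGetIntegerv(GL_CURRENT_PROGRAM, &prev);  /* save SDL's program */
    glUseProgram(myProgram);

    float tw, th;
    SDL_GL_BindTexture(target, &tw, &th);      /* expose the texture to raw GL */

    /* identity matrices so the quad below fills the window */
    glMatrixMode(GL_PROJECTION); glPushMatrix(); glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);  glPushMatrix(); glLoadIdentity();

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, th);   glVertex2f(-1.0f, -1.0f);
        glTexCoord2f(tw,   th);   glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(tw,   0.0f); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(-1.0f,  1.0f);
    glEnd();

    glMatrixMode(GL_MODELVIEW);  glPopMatrix();
    glMatrixMode(GL_PROJECTION); glPopMatrix();

    SDL_GL_UnbindTexture(target);
    glUseProgram((GLuint)prev);                /* hand the state back to SDL */
    SDL_RenderPresent(renderer);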