What happened to SDL_SetTextureScaleMode?

I finally finished porting my little game to SDL 1.3. The last point
on the list had been scaling to desktop resolution, which I
implemented today.

Since we have a fairly low-res game and screens tend to get bigger and
bigger (and wider too), I’ve implemented a variable-sized view that
gets scaled to the actual screen size. The view size is chosen within
certain bounds so that integer scaling is possible. That means a pixel
might get blown up to 2x2 or 3x3 pixels in the native resolution of
your LCD screen, and you’ll get a blocky but crisp image.
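
(For illustration, such a choice might look like the sketch below; the
function and variable names are hypothetical, not the game’s actual
code. min_view_w/min_view_h stand for the smallest acceptable view:)

    /* Pick the largest integer zoom factor that still fits the screen,
     * then size the view so it fills the screen exactly at that zoom. */
    static void choose_view(int screen_w, int screen_h,
                            int min_view_w, int min_view_h,
                            int *view_w, int *view_h, int *zoom)
    {
        int zx = screen_w / min_view_w;
        int zy = screen_h / min_view_h;
        *zoom = (zx < zy) ? zx : zy;  /* largest factor fitting both axes */
        if (*zoom < 1) *zoom = 1;     /* screen smaller than view: 1:1    */
        *view_w = screen_w / *zoom;   /* view covers the screen exactly   */
        *view_h = screen_h / *zoom;   /* when blown up zoom x zoom        */
    }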

Now, since I went through the effort of having most of the rendering
done on the GPU, I can’t really add per-frame software scaling to the
pipeline. Instead I need to use whatever scaling SDL provides.
Unfortunately, the OpenGL renderer uses GL_LINEAR for the
GL_TEXTURE_MAG_FILTER, which reduces the output to a smeared shadow of
its former self ;-). I assume the D3D and other accelerated renderers
do something similar. A quick test with the filter set to GL_NEAREST
yielded the result I’d like to see.
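
(Such a test might look roughly like this, where texture_id is a
placeholder for whatever GL name the texture was given:)

    /* Override the filters on the bound texture so magnification
     * stays blocky instead of smeared. */
    glBindTexture(GL_TEXTURE_2D, texture_id);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);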

However, with SDL_SetTextureScaleMode gone, I don’t see a way of
achieving this in a supported manner. I can imagine it wasn’t working
well, or didn’t work for all renderers, and was therefore dropped. The
question is: can we have it back? Maybe not as a per-texture flag, but
as a hint that would be honoured when creating/blitting the textures?
(I’ve noticed that D3D applies the filter in D3D_RenderCopy whereas
OpenGL does it in GL_CreateTexture, so there might be a difference in
behaviour.)

Any opinions on that?

Kai

P.S. Since I’d really love to have this, I might come up with a patch
if there is interest. However, I’m not much into graphics programming
and have never used OpenGL or D3D directly. No need, since there’s SDL
;-). So if such a feature is already planned down the road, I’d better
wait.

My opinion is that setting GL_TEXTURE_MAG_FILTER to GL_LINEAR looks
horrible and should never be used. GL_NEAREST always looks better. Not
only should SDL not use GL_LINEAR by default, but SDL should not even
give the user the option of using GL_LINEAR. If smooth texture scaling
is desired, scale down, not up.

On 2/18/2011 16:43, Kai Sterker wrote:

> Unfortunately, the OpenGL renderer uses GL_LINEAR for the
> GL_TEXTURE_MAG_FILTER, which reduces the output to a smeared shadow of
> its former self ;-).
>
> Any opinions on that?


Rainer Deyke - rainerd at eldwood.com

I’m probably going to add a hint back in for it, but I’m talking to a few
people to see what the best default is.

Thanks!

On Fri, Feb 18, 2011 at 3:43 PM, Kai Sterker <kai.sterker at gmail.com> wrote:

> However, with SDL_SetTextureScaleMode gone, I don’t see a way of
> achieving this in a supported manner. The question is: can we have it
> back?


-Sam Lantinga, Founder and CEO, Galaxy Gameworks

On Sun, Feb 20, 2011 at 1:46 AM, Sam Lantinga wrote:

> I’m probably going to add a hint back in for it, but I’m talking to a
> few people to see what the best default is.

Great.

In the meantime, I’ve done some measurements and found that the time
invested in the native SDL 1.3 implementation was well spent. Since
our framerate is capped anyway, I measured the time required to
complete one cycle (game logic + rendering). On my desktop (2.5GHz
Core 2 Duo, NVidia 9600GT with the proprietary drivers under Ubuntu
10.10, 64-bit) it takes roughly 16ms per frame with SDL 1.2. With 1.3
and the OpenGL renderer that’s down to 0.9ms!
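
(For anyone reproducing this: such a measurement can be done with the
1.3 high-resolution counter, roughly as below, since SDL_GetTicks only
has millisecond resolution; the two function names are placeholders
for the game’s own steps:)

    Uint64 start = SDL_GetPerformanceCounter();
    update_game_logic();  /* placeholder: the game's update step */
    render_frame();       /* placeholder: the game's render pass */
    Uint64 stop = SDL_GetPerformanceCounter();
    double ms = (double)(stop - start) * 1000.0 /
                (double)SDL_GetPerformanceFrequency();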

I hope that next week I’ll be able to make the same test on my old
iBook, where right now a cycle with an older revision of SDL 1.3 in
1.2 compatibility mode takes ~170ms.

So thanks a lot!

Kai

> I’m probably going to add a hint back in for it, but I’m talking to a
> few people to see what the best default is.

This change also breaks compatibility in our software. We use nearest filtering for some UI elements and linear filtering for video frames, and we need both. So I’d opt to leave things as they were, or at least keep SDL_SetTextureScaleMode: removing it has no impact on performance, but it reduces the flexibility of SDL.

Cheers,
--
Adam Strzelecki

The hint is in: SDL_HINT_RENDER_SCALE_QUALITY

On Thu, Mar 31, 2011 at 8:34 AM, Adam Strzelecki wrote:

> So I’d opt to leave things as they were, or at least keep
> SDL_SetTextureScaleMode: removing it has no impact on performance, but
> it reduces the flexibility of SDL.


-Sam Lantinga, Founder and CEO, Galaxy Gameworks

> The hint is in: SDL_HINT_RENDER_SCALE_QUALITY

Thanks a lot, that saves me from patching SDL.
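
For the archives: the hint is set with SDL_SetHint before the textures
are created, and since the GL renderer picks the filter up in
GL_CreateTexture (as Kai noted), toggling the hint around each texture
creation looks like a workable way to get both modes. A rough,
untested sketch:

    /* Crisp scaling for pixel-art UI elements: "0" selects nearest. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "0");
    SDL_Texture *ui = SDL_CreateTextureFromSurface(renderer, ui_surface);

    /* Smooth scaling for video frames: "1" selects linear. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1");
    SDL_Texture *video = SDL_CreateTextureFromSurface(renderer, video_surface);

Since D3D applies the filter in D3D_RenderCopy instead, this
per-texture trick presumably only helps on renderers that latch the
filter at creation time.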

Cheers,
--
Adam Strzelecki

Kai,

Where is the game? I like the sound of how much work went into it.