Sprite sheet overlap when using hardware textures

I’m using the SDL2 2D graphics API with SDL_Textures. When in hardware mode (tested with direct3d and opengl), but NOT when in software mode, I have a problem:

I’m making a game with a tiny resolution (256x144). I have two fonts. Each font has one image (.png) that contains all of its characters. There are 256 characters. The standard font’s characters are 8x8, so the image is 2048x8. The small font’s characters are 3x5, so the image is 768x5.
I use ‘SDL_RenderSetLogicalSize(renderer, SCREEN_WIDTH, SCREEN_HEIGHT);’ to set the logical render size to 256x144, and I give the window a larger resolution (though my tests all use the same aspect ratio).
On certain resolutions (640x360, 1920x1080) my small font (but not my standard one) is messed up. It’s fine at 256x144 obviously, and it also looks fine at 1280x720.
The weird look appears to be that at certain positions in the window, small font characters will show parts of other characters that are adjacent to them on the sprite sheet image itself.
I remembered something from the ancient past about non-power of 2 textures, so I tried padding my small font image, but that didn’t seem to make a difference. I’m using “nearest” render scale quality.
It seems to be some kind of rounding issue (move a messed up character over 1 logical pixel, and suddenly it looks fine). The best possible thing I’ve found from Googling is to try putting empty padding between sprites on the spritesheet, but that is a terribly painful solution, and I’m hoping there is something I’m missing.
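In case it’s relevant, the drawing boils down to something like this (a simplified sketch, not my exact code; draw_small_char and the hard-coded sizes are just illustrative):

Code:

#include <SDL.h>

/* Simplified sketch: glyph c lives at x = c * 3 on the 768x5 small-font
   sheet, and everything is drawn in 256x144 logical pixels. */
static void draw_small_char(SDL_Renderer *renderer, SDL_Texture *small_font,
                            unsigned char c, int x, int y)
{
    SDL_Rect src = { c * 3, 0, 3, 5 };  /* 3x5 glyph from the sheet */
    SDL_Rect dst = { x, y, 3, 5 };      /* destination in logical pixels */
    SDL_RenderCopy(renderer, small_font, &src, &dst);
}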
Can anyone tell what my problem is?

Yeah, that sounds like a rounding issue. No idea whether it’s within
SDL or your hardware’s fault (since I know some low-end GPUs are
rather lackluster when it comes to it).

Sik wrote:

Yeah, that sounds like a rounding issue. No idea whether it’s within
SDL or your hardware’s fault (since I know some low-end GPUs are
rather lackluster when it comes to it).


I’ve got a Geforce GTX 760 in this machine. I also tested it with the same results on a box with a Geforce GTS 250. I just tested it on Android (a Galaxy S4), and 1920x1080 actual resolution on there looks fine! Hrm…

Have you tried setting the SDL_HINT_RENDER_SCALE_QUALITY hint to 0 (nearest neighbor)?
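That is, something like this, before the textures are created (the hint is only read at texture creation time):

Code:

/* "0" and "nearest" are equivalent values for this hint. */
SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest");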

http://www.gamedev.net/topic/627546-solved-spritesheet-bleeding/

Note this thread is from 2012, and the OP’s use of OpenGL was less than
optimal even then (glBegin(GL_QUADS), really?!), but it discusses
some of the problems and is worth a look.

You should not be having this problem with nearest neighbor pixel
filtering, but hardware sometimes optimizes fast in place of correct.

Is your text MOVING when the artifacts show up, and what’s your blend
function?
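(You can check it with something like the following; font_texture here stands in for whatever texture you draw the glyphs from:)

Code:

SDL_BlendMode mode;
SDL_GetTextureBlendMode(font_texture, &mode);  /* font_texture: your glyph sheet */
/* SDL_BLENDMODE_BLEND is the usual setting for alpha-blended sprites */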


Dark_Oppressor wrote:

I’ve got a Geforce GTX 760 in this machine. I also tested it with the same
results on a box with a Geforce GTS 250. I just tested it on Android (a
Galaxy S4), and 1920x1080 actual resolution on there looks fine! Hrm…

OK, that definitely sounds like something to look into in SDL, then; I
doubt those cards have that kind of bug.

mr_tawan wrote:

Have you tried setting the SDL_HINT_RENDER_SCALE_QUALITY hint to 0 (nearest neighbor)?

Yes, I’m sorry: I mentioned in my original post that I was using nearest render scale quality, but I should have made it clearer. Setting that hint to “nearest” is what I’m currently doing.

I’m running into the same problem of sprite frame bleeding.

I can say with 99% certainty that this is on SDL’s end. I still need to look at SDL’s source code to verify where.

The frame information you pass into SDL (SDL_RenderCopy) is an SDL_Rect, which is all integers, so it’s impossible to deviate from those specifications: as long as those numbers haven’t been corrupted by some float calculation you did beforehand, the width and height will always be correct and should not cause frame bleed. In my case, I have verified that my data is always correct, and yes, my quality hint is set to nearest for pixel-perfect drawing, or as close to that as the library offers.
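For reference, SDL_Rect is defined with plain ints, so there is no room for the caller to introduce sub-pixel error:

Code:

/* From SDL_rect.h: every field is an integer. */
typedef struct SDL_Rect
{
    int x, y;
    int w, h;
} SDL_Rect;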

One thing about this bug is I only see it occur in windowed mode. When I go fullscreen the bleeding does not occur. I always use the desktop resolution and then set a logical resolution of 320x240, with windowed mode using an 800x600 window.
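Concretely, my setup is along these lines (a sketch; error checking and the fullscreen path omitted):

Code:

SDL_Window *window = SDL_CreateWindow("game",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        800, 600, 0);                             /* windowed: bleeding occurs */
SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, 0);
SDL_RenderSetLogicalSize(renderer, 320, 240);     /* logical resolution */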

So until I look at the source, I can only assume SDL itself is actually using floats somewhere in the texture lookup for the source rectangle and messing it up. It’s a common mistake, and with SDL2 being a new iteration of SDL, it’s not that big of a shock that it has some bugs.

This is being done on a Windows 7 x64 machine.

If anybody has anything else to add, that’d be great.

I believe the problem for the opengl renderer, which is the one I use and I assume is the OP’s renderer as well, might be coming from the source file SDL_render_gl.c, starting at line 1209, where floats are used to calculate the texture coordinates for the source rectangle:

Code:

/* Destination corners in render coordinates (from the integer SDL_Rect). */
minx = dstrect->x;
miny = dstrect->y;
maxx = dstrect->x + dstrect->w;
maxy = dstrect->y + dstrect->h;

/* Source rect converted to normalized texture coordinates: divide by the
   texture dimensions, then scale by texw/texh, which account for any
   power-of-two padding of the underlying GL texture. */
minu = (GLfloat) srcrect->x / texture->w;
minu *= texturedata->texw;
maxu = (GLfloat) (srcrect->x + srcrect->w) / texture->w;
maxu *= texturedata->texw;
minv = (GLfloat) srcrect->y / texture->h;
minv *= texturedata->texh;
maxv = (GLfloat) (srcrect->y + srcrect->h) / texture->h;
maxv *= texturedata->texh;

Maybe some kind soul could check it out and compile to see? I believe parentheses surrounding the integer calculations before the cast to GLfloat might be what is needed to prevent the issue… I may be wrong, but I don’t think the GLfloat cast is casting the entire expression, only the left-hand operand of the divisions… it would make sense considering the issue.
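For what it’s worth, here is a tiny standalone C demo of the precedence in question (not SDL code; the values are illustrative). The cast does bind only to the left operand, and that is exactly what promotes the division to floating point; parenthesizing the integer math first would truncate before the cast:

Code:

#include <stdio.h>

int main(void)
{
    int x = 3, w = 768;

    float a = (float) x / w;    /* cast binds to x: float division, 0.003906 */
    float b = (float) (x / w);  /* integer division happens first: 0.000000 */

    printf("%f %f\n", a, b);
    return 0;
}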

Or SDL could use the integer form of the GL functions instead. Why use floats anyway? Anyone who calls the RenderCopy function can only pass integers to it in the form of SDL_Rect…

So after all that and a bug report, it comes down to the GPU’s handling of texels. The way around it is to just add an empty pixel border around every single image and sprite frame you use. It’s not fun, but it’s worth it… grumble… grumble…
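If it helps anyone, the workaround amounts to something like this (a hypothetical sketch; padded_frame and the constants are illustrative). Each frame is repacked into a cell with a 1px transparent gutter on every side, and the source rect is offset to skip the gutter:

Code:

#include <SDL.h>

#define FRAME_W 3
#define FRAME_H 5
#define GUTTER  1

/* Frame i starts at GUTTER + i * (FRAME_W + 2 * GUTTER) on the padded
   sheet, so a sample that strays one texel hits transparent gutter
   instead of a neighboring frame. */
static SDL_Rect padded_frame(int index)
{
    SDL_Rect src;
    src.x = GUTTER + index * (FRAME_W + 2 * GUTTER);
    src.y = GUTTER;
    src.w = FRAME_W;
    src.h = FRAME_H;
    return src;
}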