Render targets, scale quality, and switching fullscreen problem

Off the bat, I am very much a beginner with both C++ and SDL, so I apologize if I’m missing something obvious here. I have been spending some time trying to get to grips with basic SDL usage, and have been unable to overcome a problem I’ve encountered.

I was investigating texture rendering in the context of low-resolution pixel art, and attempted to implement a fullscreen scaling system beyond simply stretching a low-res image with nearest neighbour scaling. The aim was to avoid the uneven stretching that nearest neighbour produces at non-integer multiples (as often happens when scaling small images up to a fullscreen resolution).

The design was simple:

  1. Scale the original image with nearest neighbour interpolation to a size close to the final desired output size, sticking with integer multiples to avoid unevenness.
  2. Scale again to fit the screen resolution using linear interpolation. This results in a slight softness, but is more desirable to my eye than using non-integer nearest neighbour scaling by itself.

My implementation below is the problem: it only somewhat works. It gives my intended result initially, but fails after toggling the window out of fullscreen and back (spacebar in my example). Rendering still occurs after this, but the output is highly blurred, as though the “nearest” scale quality is being ignored.

#include "SDL.h"

int main(int argc, char *argv[]){

	SDL_Init(SDL_INIT_VIDEO);
	SDL_Window *window = SDL_CreateWindow("Scaling test", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 400, 240, SDL_WINDOW_FULLSCREEN_DESKTOP);

	//Relevant texture initialization.  The scale quality hint is read when a texture is
	//created, so it is set before each creation: nearest for the pixel-art textures,
	//linear for the final stretch to the screen.
	SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_TARGETTEXTURE | SDL_RENDERER_ACCELERATED);

	SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest");
	SDL_Surface *temp = SDL_LoadBMP("testimage.bmp");
	SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, temp);
	SDL_FreeSurface(temp);

	SDL_Texture *renderTexture = SDL_CreateTexture(renderer, SDL_GetWindowPixelFormat(window), SDL_TEXTUREACCESS_TARGET, 400, 240);

	SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");
	SDL_Texture *scaleTexture = SDL_CreateTexture(renderer, SDL_GetWindowPixelFormat(window), SDL_TEXTUREACCESS_TARGET, 2000, 1200);

	//Largely irrelevant (I believe), event handling.  Only point of note is that setting
	//the window to fullscreen happens here.
	bool gameloop = true;
	bool fullscreen = true;

	SDL_Event e;

	while (gameloop){
		while (SDL_PollEvent(&e)){
			if (e.type == SDL_QUIT)
				gameloop = false;
			else if (e.type == SDL_KEYUP && e.key.keysym.scancode == SDL_SCANCODE_ESCAPE)
				gameloop = false;
			else if (e.type == SDL_KEYUP && e.key.keysym.scancode == SDL_SCANCODE_SPACE){
				if (fullscreen)
					SDL_SetWindowFullscreen(window, 0);
				else
					SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN_DESKTOP);
				fullscreen = !fullscreen;
			}
		}

		//Relevant rendering code
		SDL_SetRenderTarget(renderer, renderTexture);
		//Although this call to RenderCopy could introduce scaling problems, I believe it to be
		//a non-issue in my case as texture and renderTexture have equal sizes.
		SDL_RenderCopy(renderer, texture, nullptr, nullptr);
		SDL_SetRenderTarget(renderer, scaleTexture);
		SDL_RenderCopy(renderer, renderTexture, nullptr, nullptr);
		SDL_SetRenderTarget(renderer, nullptr);
		SDL_RenderCopy(renderer, scaleTexture, nullptr, nullptr);
		SDL_RenderPresent(renderer);

		SDL_Delay(1000 / 60);
	}

	SDL_DestroyTexture(scaleTexture);
	SDL_DestroyTexture(renderTexture);
	SDL_DestroyTexture(texture);
	SDL_DestroyRenderer(renderer);
	SDL_DestroyWindow(window);
	SDL_Quit();

	return 0;
}

I admit to some confusion regarding the use of SDL_SetHint, but I believe I’m correct in that it only needs to be set before texture creation, and not before the various SDL_Render… functions.

In trying to solve my problem, I found that the scaling works correctly if renderTexture is taken out entirely and texture is rendered directly to scaleTexture. Maybe there is something I’m missing regarding rendering from one target texture to another target texture?
In any case, while this works for this little test, I feel it is not an appropriate solution. In a larger project it would necessitate scaling every piece of pixel-art rendering individually, rather than scaling renderTexture once.

As mentioned, I’ve been unable to find a solution to my problem. Hopefully I was able to explain myself well enough. Any insights would be greatly appreciated.

After a bit more delving into SDL, I was able to ‘fix’ this problem by setting the render driver to opengl before creating the renderer, rather than letting it go to the direct3d default (I should have perhaps mentioned that I was using Windows before…sorry!). The only edit needed is shown below.

SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");    //New Line
SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_TARGETTEXTURE | SDL_RENDERER_ACCELERATED);

With this change, my code works as I previously expected. Explicitly setting the driver to direct3d, however, recreates the issue mentioned earlier. Although I am not against using OpenGL, and am happy enough that things are working, I do wonder what could be causing the difference in how the code behaves.

I really don’t think it should work like that, but both DirectX and OpenGL are large annoying beasties. Does this only happen when in a window, or does it still happen after switching back and forth?

Thanks for the reply Blerg. I don’t know enough to make any calls as to whether this is correct behaviour or not, but I did find it strange that the code behaves differently depending on the render driver.

I’m far from proficient enough to say whether the issue still occurs when windowed. Comparing the windowed output against the original input file in an image editor, however, shows no difference with either the direct3d or OpenGL modes.

In my example, the window size matches the resolution of the input image that I have been using. If the issue does remain when windowed in direct3d, I feel that this matched scale could potentially be masking it.

I’m reasonably confident that the issue lies somewhere with the two render target textures, renderTexture and scaleTexture. Presumably, the issue still occurs with direct3d when windowed, as those two textures are still being used in the same manner.

Kinda thought it might be like that. I used to see this kind of issue when I was more into the emulator scene. If you take screenshots within your program, and they’re the same with both renderers, fullscreen and windowed, then to me it implies that maybe your graphics driver is messing with it. Make sure you’re pulling the screenshots directly from the renderer though, or you may be fooling yourself. Also (it shouldn’t be like this, but this is for troubleshooting purposes), make sure that the window sizes are the same after changing between fullscreen modes.

Hmmm, maybe also try just using DirectX instead of DirectX11? [link: “BLURRY SCREEN AT 1920X1080 AFTER PATCH”]

Both OpenGL and Direct3D appear to have the same output when windowed, but OpenGL works (correctly, as far as I can tell) in fullscreen whilst Direct3D does not.

Make sure you’re pulling the screenshots directly from the renderer

I’ve simply been using printscreen to grab images. Is this not sufficient?

Hmmm, maybe also try just using DirectX instead of DirectX11?

The link you provided could be a similar issue, but I cannot say that I’ve encountered any form of ‘vibration’ on my output as mentioned.

Regarding DirectX vs. DirectX11, how would I go about testing this? I presumed that setting SDL_HINT_RENDER_DRIVER to “direct3d” was DirectX 9.

Trying to find anything on this got me to this link. Although not related to my problem, it mentions DirectX 11 was disabled by default, at least at the time of SDL 2.0.4. I don’t pretend to understand the information there, but the writer’s conclusion that DirectX 11 doesn’t seem to work on Windows 7 seems to preclude it being an issue for me, having stayed on Windows 7 myself.

Oooops, sorry, I meddle with the source code to SDL2 every now and again and make poor assumptions about what people do and do not know. You should be able to accomplish this by using: SDL_SetHint with SDL_HINT_RENDER_DRIVER as the hint with either of these to change which DirectX is being used: “direct3d”, “direct3d11”, (direct3d11 is missing from the wiki right now). No quotes, and they are case sensitive.

Should be more than sufficient. I’d be inclined to believe that DPI scaling was a culprit if OpenGL wasn’t working for you.

Cheers for the tips Blerg.

Setting the render driver to “direct3d11” gave me the same results as “direct3d” did. Comparing the results in an image editor showed them to be identical.

The only thing I can think of now is that your graphics card driver is messing with the final image before it’s displayed on your monitor. If you try using just surfaces instead of textures, do the results change?

I know most graphics card drivers can disable or force options such as anti-aliasing and the like. If your code has no Windows-specific code in it, you could load up a Linux virtual machine and test it out that way.

If I use surfaces only, I miss out on the linear texture scaling entirely, getting a nearest neighbour scaled result. I believe SDL_HINT_RENDER_SCALE_QUALITY only affects textures.

Good thought regarding graphics driver options, but fiddling around with the available settings didn’t fix anything either.

At this point, I’m really only interested in the problem out of curiosity. This, combined with my lack of knowledge, makes testing in a Linux virtual machine a bit too daunting really. Since OpenGL works fine, it’s far from a show-stopper for me.

You and me both. I’m still leaning towards the monitor or graphics card doing something funky to the texture when it’s presented. As far as the OS is concerned they’re the same, since the screenshots look identical to you.

This will probably happen with your game sooner or later, so you might as well try to resize the screen a few times after leaving fullscreen mode to see if the issue still persists/gets worse. Unless of course, you’re using unresizable windows.