Hi everyone! I'm new to this forum, so let me start by greeting the whole community. When I use the SDL_RENDERER_ACCELERATED flag in SDL_CreateRenderer, I get strange graphics when scaling the window with SDL_RenderSetLogicalSize, which doesn't happen if I use the SDL_RENDERER_SOFTWARE flag. Hardware-accelerated rendering is attractive because of the remarkable performance it gives, but the graphical result is unpleasant. Why does this happen?

Sorry for having to upload the images to another domain, but I'm a new user and the forum doesn't allow me to post more than one image at the moment.

SDL_RENDERER_ACCELERATED; look at the weird lines that form on each tile:

SDL_RENDERER_ACCELERATED with SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1"); the problem is even more noticeable:

SDL_RENDERER_SOFTWARE (how I actually want it to look):

This is my graphics class, where I implement all the functions needed to draw (graphics.cpp):

#include "graphics.h"
#include "game.h"

#include <SDL2/SDL.h>

#include <cstdio>   // fprintf
#include <cstdlib>  // exit

Graphics::Graphics() {
	window_ = SDL_CreateWindow("Cave Story Clone - SDL2",
			SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
			units::tileToPixel(Game::kScreenWidth),
			units::tileToPixel(Game::kScreenHeight),
			0);

	// TODO: fix weird graphics when SDL_RENDERER_ACCELERATED is used
	renderer_ = SDL_CreateRenderer(window_, -1, SDL_RENDERER_ACCELERATED);

	SDL_RenderSetLogicalSize(renderer_, units::tileToPixel(Game::kScreenWidth),
			units::tileToPixel(Game::kScreenHeight));
}

Graphics::~Graphics() {
	for (SpriteMap::iterator iter = spr_sheets_.begin();
			iter != spr_sheets_.end();
			++iter) {
		SDL_DestroyTexture(iter->second);
	}
	SDL_DestroyRenderer(renderer_);
	SDL_DestroyWindow(window_);
}

SDL_Texture* Graphics::surfaceToTexture(SDL_Surface* surface) {
	return SDL_CreateTextureFromSurface(renderer_, surface);
}

Graphics::SurfaceID Graphics::loadImage(const std::string& file_name, bool black_to_alpha) {
	const std::string file_path = config::getGraphicsQuality() == config::ORIGINAL ?
		"assets/" + file_name + ".pbm" :
		"assets/" + file_name + ".bmp";

	// if we have not loaded in the spritesheet
	if (spr_sheets_.count(file_path) == 0) {
		// load it in now
		SDL_Surface* image = SDL_LoadBMP(file_path.c_str());

		if (!image) {
			fprintf(stderr, "Could not find image: %s\n", file_path.c_str());
			exit(EXIT_FAILURE);
		}

		if (black_to_alpha) {
			const Uint32 black_color = SDL_MapRGB(image->format, 0, 0, 0);
			SDL_SetColorKey(image, SDL_TRUE, black_color);
		}

		spr_sheets_[file_path] = surfaceToTexture(image);
		SDL_FreeSurface(image);  // the texture owns a copy; the surface is no longer needed
	}

	return spr_sheets_[file_path];
}

void Graphics::render(
		SurfaceID source,
		SDL_Rect* source_rectangle,
		SDL_Rect* destination_rectangle) {
	if (source_rectangle) {
		destination_rectangle->w = source_rectangle->w;
		destination_rectangle->h = source_rectangle->h;
	} else {
		uint32_t format;
		int access, w, h;
		SDL_QueryTexture(source, &format, &access, &w, &h);
		destination_rectangle->w = w;
		destination_rectangle->h = h;
	}

	SDL_RenderCopy(renderer_, source, source_rectangle, destination_rectangle);
}

void Graphics::clear() {
	SDL_RenderClear(renderer_);
}

void Graphics::flip() {
	SDL_RenderPresent(renderer_);
}

void Graphics::setFullscreen() {
	windowed = !windowed;

	if (windowed) {
		SDL_SetWindowFullscreen(window_, 0);
	} else {
		SDL_SetWindowFullscreen(window_, SDL_WINDOW_FULLSCREEN_DESKTOP);
	}
}

And this is my sprite class (sprite.cpp):

#include "sprite.h"
#include "graphics.h"

Sprite::Sprite(
		Graphics& graphics,
		const std::string& file_name,
		units::Pixel source_x, units::Pixel source_y,
		units::Pixel width, units::Pixel height) {
	const bool black_to_alpha = true;

	spr_sheet_ = graphics.loadImage(file_name, black_to_alpha);
	source_rect_.x = source_x;
	source_rect_.y = source_y;
	source_rect_.w = width;
	source_rect_.h = height;
}

void Sprite::draw(Graphics& graphics, units::Game x, units::Game y, SDL_Rect* camera) {
	SDL_Rect destination_rectangle;
	if (!camera) {
		destination_rectangle.x = units::gameToPixel(x);
		destination_rectangle.y = units::gameToPixel(y);
	} else {
		destination_rectangle.x = units::gameToPixel(x) - camera->x;
		destination_rectangle.y = units::gameToPixel(y) - camera->y;
	}

	graphics.render(spr_sheet_, &source_rect_, &destination_rectangle);
}

It should be noted that each object on the screen is an individual sprite, and the lines appear on each of them.

The program runs at a resolution of 640x480, and my monitor's resolution, to which it is scaled, is 1920x1080. I use Arch Linux with X11, and I'm not calling any OpenGL functions directly; I only use the SDL2 API for rendering.

Sorry for my bad English; I'm a native Spanish speaker. Thanks for everything, even just for taking the time to read this, and have a great day, everyone.

Thanks to everyone!


I suspect it’s the texture filtering. If you don’t create your game’s artwork to work around the fact that the GPU will pull in surrounding pixels when sampling the texture, you can get graphical artifacts like this.

Try turning texture filtering off. Instead of SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1") (which selects bilinear texture filtering), do SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "0") (nearest-neighbor filtering, which is no filtering at all), as mentioned in the documentation here

The reason it doesn’t show up in the unaccelerated software renderer is because that isn’t doing any texture filtering.


Oh, I thought of that too and tried setting that hint to "0", but the problem persists. In fact, the first capture uses the hint with that value. Thanks!


I had a similar problem with tiled textures. It seems that the pixels on the one edge of the texture don’t match pixels on the other edge in the way you’re seeing. I had to add a border of blank pixels on every edge of my textures and then bring in the source coordinates on every edge to compensate when doing SDL_RenderCopy.
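A sketch of the bookkeeping for that border trick (the function name and the 1-pixel-per-side border layout are illustrative, not from the original code):

```cpp
// Source-rect math for an atlas where every tile cell carries a blank
// border of `border` pixels on each side. The returned rect covers only
// the tile's visible pixels, so filtering samples the blank border
// instead of a neighboring tile.
struct Rect { int x, y, w, h; };

Rect paddedTileRect(int col, int row, int tile_w, int tile_h, int border) {
	const int stride_x = tile_w + 2 * border;  // full cell width, border included
	const int stride_y = tile_h + 2 * border;
	Rect r;
	r.x = col * stride_x + border;  // skip this cell's left border
	r.y = row * stride_y + border;  // skip this cell's top border
	r.w = tile_w;
	r.h = tile_h;
	return r;
}
```

For 16x16 tiles with a 1-pixel border, the cell at column 2, row 1 has its visible pixels at (37, 19)-(52, 34): each cell occupies an 18x18 footprint, and the source rect steps one pixel inside it.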

Without this, different video cards caused different artefacts.


One of the solutions I've thought of is to render all the textures into a single one, and then scale that in the SDL_RenderCopyEx call, but I don't think it's the most efficient solution.


I think performance in this case would be fine, but you will lose the ability to have a smooth scrolling camera, which may or may not be a problem for you.

I had a very similar issue with a tile-based game long ago (not using SDL render). The fix I came up with was to slightly offset the texture coordinates, to make sure the sampling always happened within the source rect.

So suppose normally the source rect in pixels would be (100, 80) - (120, 100), I would draw it into the same destination rect, but with source coordinates (100.02, 80.02) - (119.98, 99.98).


I wonder if it might be helpful to disable texture wrapping (assuming SDL doesn’t already). Using the OpenGL back end this would be:
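A minimal sketch of that setting, assuming the texture in question is currently bound and `texture_id` is a placeholder:

```cpp
// Clamp sampling at the texture's edges instead of wrapping around,
// so the sampler never blends in texels from the opposite edge.
glBindTexture(GL_TEXTURE_2D, texture_id);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```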



That OpenGL setting helps a bit, but isn’t a universal fix. Even if you find the equivalent Direct3D setting, there’s always a chance the user has set their Nvidia settings to override it. Offsetting the source texture coordinates by 1 pixel on each edge fixed everything for me.

The other way to do it is render the pixels yourself to a target texture (pretty quick once you work out how), or just use the software renderer.


Texture wrapping is probably never useful in SDL, so if it can cause artefacts of this sort it would perhaps be desirable for it to be disabled by default (or at least to have a hint for it) when the backend supports that option.


An apparent solution is to use SDL_RenderSetIntegerScale(renderer, SDL_TRUE), but it scales the resolution by whole-number factors only, so it will very likely not cover the entire screen.
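To see why it doesn't cover the screen, here is a sketch of the largest integer factor such scaling can pick (names are illustrative):

```cpp
// Integer scaling picks the largest whole-number factor at which the
// logical size still fits inside the window on both axes.
struct Size { int w, h; };

int integerScaleFactor(Size logical, Size window) {
	int sx = window.w / logical.w;  // truncating division = floor for positive values
	int sy = window.h / logical.h;
	int s = sx < sy ? sx : sy;      // both axes must fit
	return s > 0 ? s : 1;           // never scale below 1x
}
```

For a 640x480 logical size on a 1920x1080 display, the width would allow 3x but the height only allows 2x, so the factor is 2: the image covers 1280x960 and the rest of the screen is letterboxed.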


Letting the hardware clamp the texture coordinates wouldn’t help, since the code uses a sprite sheet, and the sprites are sourced from a part of a bigger texture.

I just realized SDL_Rect contains integer coordinates, so my solution wouldn’t help either (unless there’s a way to draw images with floating point precision). Adding a border and adjusting coords as suggested above may help.


The option that worked best for me is to render the scene into a backbuffer (another texture) and then render that texture with both source_rectangle and destination_rectangle set to NULL. I declare this thread resolved. Thank you all!
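A sketch of that approach (the renderer variable, the RGBA8888 pixel format, and the fixed 640x480 size are assumptions; error checking omitted):

```cpp
// One-time setup: an off-screen texture at the native 640x480 resolution.
SDL_Texture* backbuffer = SDL_CreateTexture(renderer_,
		SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 640, 480);

// Each frame: draw the scene into the backbuffer at 1:1 pixel scale...
SDL_SetRenderTarget(renderer_, backbuffer);
SDL_RenderClear(renderer_);
// ... draw all sprites here ...

// ...then scale the finished frame to the window in a single copy, so any
// filtering happens once on the whole image instead of once per tile.
SDL_SetRenderTarget(renderer_, NULL);
SDL_RenderCopy(renderer_, backbuffer, NULL, NULL);
SDL_RenderPresent(renderer_);
```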


I'd also like to highlight that you need quotes around the number: SDL_SetHint takes a string, not an integer. I messed this up before when trying to solve the same issue, and concluded that the SDL_HINT_RENDER_SCALE_QUALITY fix had no effect.