SDL 2.0 Rendering SDL_Texture

Hi.

I am learning SDL 2.0 and I have some questions (problems) regarding rendering techniques that I was hoping someone would help me with. :slight_smile:

First of all, I am going to post what my code for rendering looks like right now.
(I will leave out stuff like error checking etc for the sake of readability)

Initialization
Code:

_renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");

SDL_RenderSetLogicalSize(
_renderer,
1000,
1000
);

_backbuffer = SDL_CreateTexture(
_renderer,
SDL_PIXELFORMAT_RGBA8888,
SDL_TEXTUREACCESS_TARGET | SDL_TEXTUREACCESS_STREAMING,
1000,
1000
);

Drawing
Code:

SDL_SetRenderDrawColor(_renderer, 100, 149, 237, 255);
SDL_RenderClear(_renderer);

SDL_SetRenderTarget(_renderer, _backbuffer);
SDL_RenderCopy(_renderer, _texture, nullptr, &dest);

SDL_SetRenderTarget(_renderer, nullptr);
SDL_RenderCopy(_renderer, _backbuffer, nullptr, nullptr);
SDL_DestroyTexture(_backbuffer);
SDL_RenderPresent(_renderer);

I am basically using this for drawing sprites and tiles, as well as some minor pixel effects.
At the moment I am only using SDL_Textures, not SDL_Surfaces.

The problems and questions I have are…

  1. I can’t get SDL_RenderSetLogicalSize(_renderer, 1000, 1000); to work as intended.
    It says in the migration guide that…

For 2.0, the render API lets you do this…

Code:
SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "linear");
SDL_RenderSetLogicalSize(sdlRenderer, 640, 480);

…and it will do the right thing for you. This is nice in that you can change the logical rendering size to achieve various effects, but the primary use is this: instead of trying to make the system work with your rendering size, we can now make your rendering size work with the system. On my 1920x1200 monitor, this app thinks it’s talking to a 640x480 resolution now, but SDL is using the GPU to scale it up to use all those pixels. Note that 640x480 and 1920x1200 aren’t the same aspect ratio: SDL takes care of that, too, scaling as much as possible and letterboxing the difference.

…but unfortunately, I am not getting that effect. There is no scaling going on at all.
Is it because I am using SDL_RenderCopy(_renderer, _texture, nullptr, &dest) instead of SDL_UpdateTexture(sdlTexture, NULL, myPixels, 640 * sizeof (Uint32)) as the guide suggests?

  2. Is this method of rendering considered “correct”, and is it effective?
    As I said, the guide uses SDL_UpdateTexture, which I was not able to get to work. If I understand it correctly you need to lock the texture to access its pixels, but I was not able to do that; could someone please provide me with a minimal example? And also, on that note, what does the literal integer value 640 represent in the code SDL_UpdateTexture(sdlTexture, NULL, myPixels, 640 * sizeof (Uint32))?

Maybe some of these questions have been answered before, but I have not been able to find an answer to them, so if anyone is able to help me out, thanks in advance! / AS

AlanSmithee wrote:

[…]

First, don’t get too hung up on SDL_UpdateTexture(). The purpose of that function is to copy pixel data from system RAM (generally, a surface) into GPU RAM (a texture). It has nothing to do with rendering, other than allowing you to put something into a texture before rendering it.
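
That said, since a minimal locking example was asked for: with a texture created with SDL_TEXTUREACCESS_STREAMING, the usual pattern looks roughly like the sketch below (error checking omitted; filling in the pixels is left as a placeholder).
Code:

void* pixels;
int pitch; // bytes per row of the locked region, filled in by SDL
SDL_LockTexture(texture, NULL, &pixels, &pitch);
// write RGBA8888 pixel data into 'pixels' here;
// row y starts at (Uint8*)pixels + y * pitch
SDL_UnlockTexture(texture);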

Second, rendering from _texture to _backbuffer, and then _backbuffer to the screen is wasteful - you get the same effect by just rendering _texture directly to the screen. There’s no need for a “backbuffer” - the render to screen will automatically go to the driver’s backbuffer (okay, I’m not sure it’s the driver’s backbuffer - could be OpenGL/DirectX/etc handling this), and not be swapped until you call SDL_RenderPresent(). There are good reasons to render to a texture (such as combining multiple sprites into a single sprite which will be rendered multiple times), but your code doesn’t appear to contain one of them.
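
In other words, your drawing code could shrink to something like this (a sketch, reusing your names):
Code:

SDL_SetRenderDrawColor(_renderer, 100, 149, 237, 255);
SDL_RenderClear(_renderer);
SDL_RenderCopy(_renderer, _texture, nullptr, &dest); // draws into the hidden backbuffer
SDL_RenderPresent(_renderer); // swaps it onto the screen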

As to why you aren’t getting any scaling - where exactly are you expecting scaling to occur? On the first RenderCopy(), the values in SDL_RenderSetLogicalSize() have no effect (that’s only for when rendering to the screen). On the second RenderCopy(), the logical size and the size of the texture are the same, so no scaling will occur.

As to that last parameter for SDL_UpdateTexture(), it’s the width, in bytes, of one row from the image being transferred into the texture. In the example, it’s copying from a 640x480 image in RAM, with 32 bits per pixel, so one row is 640 x 4 bytes long.
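
Written out with the guide’s names, that’s 640 pixels per row times 4 bytes (one Uint32) per pixel:
Code:

const int pitch = 640 * sizeof(Uint32); // 2560 bytes per row
SDL_UpdateTexture(sdlTexture, NULL, myPixels, pitch);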

One final note: The “access” parameter of SDL_CreateTexture() is not a bitmask. It’s an enum. I have no clue what OR’ing SDL_TEXTUREACCESS_STREAMING and SDL_TEXTUREACCESS_TARGET will do, but it’s definitely not what you were expecting it to do. Just pick one :slight_smile:
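
So for your render target, the creation call would be just:
Code:

_backbuffer = SDL_CreateTexture(
_renderer,
SDL_PIXELFORMAT_RGBA8888,
SDL_TEXTUREACCESS_TARGET, // a single access value, not OR'ed flags
1000,
1000
);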

lloyd_b wrote:

[…]

Hi.

Thanks for your answer!

I removed the redundant buffering since, like you said, I don’t have any need for it at the moment.
Thanks for explaining when it can be useful though, I will keep it in mind, should I need it later.

Also, I will forget about SDL_UpdateTexture and stick to using SDL_RenderCopy.

About the access parameter of SDL_CreateTexture(): I misunderstood it as taking a bitmask. I read the wiki and understand how it works now…
I won’t need to worry about it for now though, as I removed the backbuffer (which was the only place I used SDL_CreateTexture).
Thinking of that, how do you set the access flag of an SDL_Texture when it is created using SDL_CreateTextureFromSurface()?

Regarding SDL_RenderSetLogicalSize, the migration guide says…

the primary use is this: instead of trying to make the system work with your rendering size, we can now make your rendering size work with the system. On my 1920x1200 monitor, this app thinks it’s talking to a 640x480 resolution now, but SDL is using the GPU to scale it up to use all those pixels. Note that 640x480 and 1920x1200 aren’t the same aspect ratio: SDL takes care of that, too, scaling as much as possible and letterboxing the difference.

…so what I expect is that I can set the logical size of the renderer to 1000x1000 pixels, to make it easier to work with, and when it is time to render, the renderer will take care of scaling that 1000x1000 output to 1280x1024 pixels or whatever size the monitor is (I’m using fullscreen mode).
Have I misunderstood this, maybe?

Thinking about it, this might not be the effect that I am after, since I already have a camera class that takes care of logic-to-view and view-to-logic conversion, but it would still be nice to know the correct way to use SDL_RenderSetLogicalSize.

Thanks for your help so far! :slight_smile: /AS

AlanSmithee wrote:

[…]
If you use SDL_CreateTextureFromSurface(), you always get a texture with the access parameter set to SDL_TEXTUREACCESS_STATIC. If you need a texture with SDL_TEXTUREACCESS_STREAMING, you create the texture yourself, and then update it via the SDL_UpdateTexture() function (take a look at the source for SDL_CreateTextureFromSurface() - it’s basically just a wrapper around SDL_CreateTexture() and SDL_UpdateTexture(), along with some code to handle mismatched pixel formats).
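
A sketch of that route, assuming the surface already matches the texture’s pixel format (otherwise convert it first, e.g. with SDL_ConvertSurfaceFormat()):
Code:

SDL_Texture* tex = SDL_CreateTexture(renderer,
SDL_PIXELFORMAT_RGBA8888,
SDL_TEXTUREACCESS_STREAMING,
surface->w, surface->h);
SDL_UpdateTexture(tex, NULL, surface->pixels, surface->pitch);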

I was mistaken in my SDL_RenderSetLogicalSize() comment - your understanding of how it’s supposed to work is correct. I’m wondering if there’s no real scaling going on because the display Y (1024) and the logical Y (1000) are so close together that the scaling is not noticeable - the scaling will not stretch the image in the X axis.
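
A quick way to test that: on a 1280x1024 display, a 1000x1000 logical size only scales everything by min(1280/1000, 1024/1000) = 1.024, which is nearly invisible. Something much smaller should make the scaling (and the letterboxing) obvious:
Code:

SDL_RenderSetLogicalSize(_renderer, 320, 240); // should be visibly scaled up, with letterboxing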

Hi again!

Thanks for the info on SDL_CreateTextureFromSurface and related functions!
Static access is what I want 99% of the time, but should I need to change it, now I know how. :slight_smile:

the scaling will not stretch the image in the X axis.

That’s it! I didn’t realize that it does not stretch the image on the width; that was exactly what was happening.
I think it is better if I stick with my old method of using logical positions and dimensions and then scaling when rendering as needed!

Thanks a million for your help again, all of my questions have been answered! =)