UWP Rendering problem

Hello everyone,

I am trying to figure out a problem that I am having with porting my app to UWP.

The application works fine except for the rendering. On my development system it looks fine, but once I install it on another system the output looks like rubbish. I found out that if I pass SDL_RENDERER_SOFTWARE when I create the renderer, I can reproduce the rubbish rendering on my development system as well. I have tried every combination of flags and nothing works.


Using the render driver opengles2 with SDL_RENDERER_SOFTWARE | SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC | SDL_RENDERER_TARGETTEXTURE doesn't even let me create a renderer; it fails with "invalid window". I set the SDL_WINDOW_OPENGL flag when creating the window, but that didn't change anything.

Lastly, using the renderer driver software with SDL_RENDERER_SOFTWARE | SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC | SDL_RENDERER_TARGETTEXTURE: again nothing changed, still rubbish rendering.

What can I do? What am I missing?

*Here is a screenshot for reference of what I mean by rubbish.

Thank you

We need to test how the software renderer decides to put this mess into the buffer.

Is it possible for you to write a small program that draws the UI elements of your application in a similar way so that the same issues are visible? This would make it much easier for us to diagnose the issue.

If not, describing what libraries, renderer settings (blend modes, scaling quality, …), and asset formats the application uses may give us at least a hint in what direction we could begin looking.

There was talk of pixel formats in the other thread, specifically SDL_PIXELFORMAT_ABGR8888. You may want to target SDL_PIXELFORMAT_ARGB8888 instead if possible. It’s better supported by the renderers.

Also make sure you got the pixel format you requested. Use SDL_QueryTexture as suggested:

surface = SDL_CreateRGBSurfaceWithFormat(0, 512, 512, 32, SDL_PIXELFORMAT_RGBA8888);

texture1 = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888, SDL_TEXTUREACCESS_STATIC, 512, 512);
texture2 = SDL_CreateTextureFromSurface(renderer, surface);

if (texture1) {
	Uint32 format;
	int access, w, h;
	SDL_QueryTexture(texture1, &format, &access, &w, &h);
	if (format != SDL_PIXELFORMAT_ARGB8888) {
		SDL_Log("Format of texture 1 is not ARGB8888 but %s.", SDL_GetPixelFormatName(format));
	} else {
		SDL_Log("Format of texture 1 is ARGB8888 as expected.");
	}
}

if (texture2) {
	Uint32 format;
	int access, w, h;
	SDL_QueryTexture(texture2, &format, &access, &w, &h);
	if (format != SDL_PIXELFORMAT_RGBA8888) {
		SDL_Log("Format of texture 2 is not RGBA8888 but %s.", SDL_GetPixelFormatName(format));
	} else {
		SDL_Log("Format of texture 2 is RGBA8888 as expected.");
	}
}

Sorry to intrude on this thread, but I’m interested in this comment. Are you referring specifically to UWP apps here, or to SDL2 more generally? Whilst I’m aware that ARGB8888 is the usual ‘native’ pixel format in Windows and Linux, ABGR8888 is standard on Android. So because performance is generally more critical on Android, I’ve tended to choose the latter format for my apps, on the basis that it should give a better balance of speed between the various platforms. Am I wrong?

This was a comment on what the SDL2 renderers accept without doing surface conversions. The exception would be the opengles renderer which only does ABGR8888. The opengles2 renderer, which I assume is used on all android devices, is currently always uploading with GL_RGBA and GL_UNSIGNED_BYTE and then swizzles in the shader, if necessary.

If OpenGL is used directly, one would choose the optimal path for that platform of course. You’re not wrong if that works for you. :wink:

I cannot use OpenGLES2, because it does not support ‘logical’ plotting (AND, OR, XOR) which my app relies on. So I use OpenGL on Windows/Linux/MacOs and OpenGLES on Android. Fortunately OpenGLES remains fully supported by SDL2 and by Android.

One tip: if you ever do get things working in OpenGLES 2, there is an open-source library, ANGLE, which does work with SDL2 on UWP (or at least, did at one point!), and which can emulate GLES2, at least, on top of Direct3D 11+.

Hello @ChliHug,

Unfortunately I cannot post the code, being production code and all. But I can post parts:

For example this is how I create my window and renderer:

void CreateWindowAndRender(int w, int h, int flag)
{
    SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);

    i_window = SDL_CreateWindow(NULL, x, y, w, h, flag);
    renderer = SDL_CreateRenderer(i_window, -1, SDL_RENDERER_ACCELERATED);
    if(renderer == nullptr)
        renderer = SDL_CreateRenderer(i_window, -1, SDL_RENDERER_SOFTWARE);
    SDL_SetWindowMinimumSize(i_window, 320, 480);
}

Here is how I handle the texture:

void Txtr::Resize(int _w, int _h) {
    bool    create_texture(texture==nullptr);
    int     w{0};
    int     h{0};

    if( texture != nullptr ) {
        SDL_QueryTexture(texture, NULL, NULL, &w, &h);
        if( w != _w || h != _h ) {
            create_texture = true;
        }
    }
    if( create_texture ) {
        if( texture != nullptr ) {
            SDL_DestroyTexture(texture);
        }
        if( _w>0 && _h>0 ) {
            auto width=_w;
#if defined(__WINRT__)
            texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ABGR8888, SDL_TEXTUREACCESS_STREAMING, width, _h);
            SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
#endif
        }
    }
}

Notes: Please don't focus too much on the other stuff. I am copying and pasting pieces to show you what I am doing; there are checks and things missing, so this is a minimal version of what's happening.

Hello @ChliHug,

I have some more info, if it's of any help. Apparently the rendering issues only happen if the graphics card is an Nvidia one.
It works fine on Intel or AMD graphics cards. Is this a known issue?


It’s new to me. I’ll try to do some tests later this week, but I’m not sure if I can find anything.

Oh, can you also tell us how you update the texture data?

Hello ChliHug,

Thank you for the quick reply. I have a class that represents a pixel with an RGBA value, and I iterate through the data and set the values. Is that helpful?

On another note, the "Pixel" class is sort of designed with ABGR as the target pixel format, which we have been using so far with no problems. But now I see that ABGR is not one of the supported pixel formats on Nvidia graphics, and when I try to set it, the format becomes SDL_PIXELFORMAT_UNKNOWN. But nevertheless, shouldn't the software renderer work instead? That one does support ABGR.


I was more curious how you use SDL_UpdateTexture.

Managed to do some testing (Windows 7, NVIDIA card, renderers: direct3d, direct3d11, opengl, opengles2, and software), but didn’t find any issues with the pixel formats.

Where and how did you get a SDL_PIXELFORMAT_UNKNOWN? I don’t think the renderers throw this at you without some critical error. Or maybe some garbage value was passed to SDL_GetPixelFormatName?


And thanks for your quick reply. I do not use SDL_UpdateTexture; I destroy the texture and create a new one if the width or height changes. I changed my Colour class to support ARGB8888 instead of the ABGR8888 that I was using, but it didn't change anything, same result.

You said Windows 7. I have no problem with building for normal Windows; all my other projects are ported to Windows using the same rendering system, because it's shared code, and they work fine. The problem first appeared on UWP.

And at this point I don't know if I am chasing a bug on my side, or if it's a UWP issue.


SDL_LockTexture then? Can’t be SDL_CreateTextureFromSurface according to your sample code.

Oh, I somehow thought, since you got it on the development machine, that the issue was outside of UWP. So maybe it’s in the winrt video driver, but that doesn’t seem to be able to create a window surface for the software renderer.

I only have Windows 8.1 with an Intel card lying around. Not sure if I can get my hands on Windows 10 to do further tests at this time.

Hello ChliHug,

I am using SDL_LockTexture; it's a streaming texture.

SDL_LockTexture(texture, 0, (void **)(&pixels), &pitch);
// pixels is a class that holds the RGBA values, and pitch is just an int.

Furthermore (this only happens on UWP): if, say, I create the texture with a width of 600 and a height of 400, I expect to get back 240,000 pixels when I lock the texture, right? I only get 120,000 pixels, half of it.

My problem also occurs if I use the software renderer; that's how I reproduced it the first time. If I use the software renderer instead of direct3d11, I get the same results as on Nvidia.

How are you determining that you only get half? If SDL_LockTexture succeeds, you get a pointer to memory of the appropriate size; in your example that should be 400 * pitch bytes. Are you saying it only updates half when you unlock, or do you get an access violation halfway through?


So after I get my pixels and start to populate them with colour, I do something like this:

for(auto j=0; j<height; j++) {
    auto pixel_row(get_pixel_row(j));
    for(auto i=0; i<width; i++) {
        pixel_row[i] = colour;
    }
}

So I grab each row and populate it. Initially I was getting an access violation halfway through. Now, if I create the texture with width*2 for that part only, then I get the right amount of pixels.

And again, that is the case only on UWP; the same rendering system is used by a couple of other apps across multiple OSes and they work absolutely fine. This is the first time I have seen this.

Oh, wow. If the pixel format of the texture says it has 4 bytes per pixel and it only allocates for 2, then there’s obviously something wrong. Not sure where, though.

I know, thank you.

Can that be a bug in the library?

It’s possible, yes. We need to track down where it exactly fails.

I managed to get Windows 10 running on a machine. Now I have to convince it to compile SDL with the winrt driver for me. This may take a while.

Hello again,

and thank you very much for the help so far, it's very much appreciated.