Or is there another way around this that I should be aware of?
If it is bug #3147 then the workaround is to ensure that you choose a ‘supported’ pixel format for your render target. For example, you can call SDL_GetRendererInfo and iterate through the texture formats it reports to find one that is suitable.
How can I tell if a pixel format is supported or not?
As I said, one way is to call SDL_GetRendererInfo and iterate through the returned texture_formats, comparing them with your preferred option(s).
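As a sketch of that approach (the helper name `find_supported_format` is my own invention; the SDL_RendererInfo fields referenced in the comments are standard SDL2), something like this picks the first of your preferred formats that the renderer reports:

```c
#include <stdint.h>
#include <stddef.h>

/* Return the first entry of 'preferred' that also appears in 'supported',
 * or 0 if there is no match. With SDL2, 'supported'/'n_supported' would
 * come from SDL_GetRendererInfo: info.texture_formats and
 * info.num_texture_formats. */
static uint32_t find_supported_format(const uint32_t *supported, size_t n_supported,
                                      const uint32_t *preferred, size_t n_preferred)
{
    for (size_t i = 0; i < n_preferred; i++) {
        for (size_t j = 0; j < n_supported; j++) {
            if (supported[j] == preferred[i]) {
                return preferred[i];
            }
        }
    }
    return 0; /* none of the preferred formats is supported */
}

/* With SDL2 it would be used roughly like this (sketch, not compiled here):
 *
 *   SDL_RendererInfo info;
 *   if (SDL_GetRendererInfo(renderer, &info) == 0) {
 *       uint32_t want[] = { SDL_PIXELFORMAT_ARGB8888, SDL_PIXELFORMAT_RGBA8888 };
 *       uint32_t fmt = find_supported_format(info.texture_formats,
 *                                            info.num_texture_formats, want, 2);
 *       if (fmt != 0) {
 *           target = SDL_CreateTexture(renderer, fmt,
 *                                      SDL_TEXTUREACCESS_TARGET, w, h);
 *       }
 *   }
 */
```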
A more drastic solution might be to switch to using the OpenGL renderer, but of course that won’t help if you are targeting WinRT/UWP.
I’m guessing that OpenGL is widespread enough for this fix to work.
OpenGL is universally available on ‘desktop’ Windows AFAIK, but whether it will work trouble-free for you depends on your app playing by all the rules. I developed my first SDL app on Windows/D3D, and that turned out to be a mistake: porting it to other platforms (which use OpenGL for rendering) exposed incompatibilities in the way I was doing things. Now I use OpenGL even on Windows (like you, I’m not interested in WinRT/UWP).
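For reference, requesting the OpenGL backend is just a hint set before creating the renderer. A minimal sketch (standard SDL2 API; window title and size are arbitrary, error handling mostly elided):

```c
#include <SDL.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;

    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        return 1;
    }

    /* Ask SDL to prefer the OpenGL render driver instead of Direct3D.
     * This is a hint, not a guarantee: SDL falls back to another
     * driver if OpenGL is unavailable. */
    SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");

    SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    /* Confirm which backend we actually got. */
    SDL_RendererInfo info;
    if (ren && SDL_GetRendererInfo(ren, &info) == 0) {
        SDL_Log("render driver: %s", info.name);
    }

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

Checking `info.name` afterwards matters because the hint is advisory; on a machine without usable OpenGL drivers you may silently get a different backend.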