On a HiDPI display on macOS (and using the SDL_WINDOW_ALLOW_HIGHDPI flag), the SDL window/display functions seem to always work with logical display sizes rather than “real” pixel sizes, and I can’t seem to find a way to determine the display scaling factor.
I know about SDL_GetRendererOutputSize(), but that’s only available after the renderer is created, which is too late: I need to know before creating the window, so I can create it at the right size right away.
Or is there any way to tell SDL to create a window of a given size in actual pixels, rather than logical display units?
I want to achieve pixel-perfect scaling in my app, and this is really messing up what I’m trying to do (on the Mac, at least). As it is, the only way I can think of is to create a dummy window and renderer, compare SDL_GetRendererOutputSize() with the window’s requested size, and then resize accordingly, but that would be an ugly kludge…
I don’t know whether this is the ‘official’ way, but comparing the values returned from SDL_GL_GetDrawableSize() and SDL_GetWindowSize() seems to work for me.
That’s just the OpenGL version of what I already said. Either way you’re still not getting the scaling factor prior to creating the window and rendering context, which is what I’m looking for here.
The difference is that my method doesn’t require a renderer to be created. Yes, it is necessary to have a window, but creating a ‘dummy’ window to discover the scaling has less overhead than creating both a window and a renderer.
Fair enough, but if one is only using the 2D renderer for their app, this isn’t really a better solution; it just adds a superfluous explicit dependency on OpenGL, which will fail on a platform that doesn’t have OpenGL (or that drops it in the future, as will eventually happen on the Mac). But if you are specifically doing OpenGL in your app, then yes, this is definitely the better thing to do.
In any case I did implement what I said in my initial post and it works fine now and is basically unnoticeable to the end user, so I’m good for now.