On a HiDPI display on macOS (and using the SDL_WINDOW_ALLOW_HIGHDPI flag), the SDL window/display functions seem to always work with logical display sizes rather than “real” pixel sizes, and I can’t find a way to determine the display scaling factor.
I know about SDL_GetRendererOutputSize(), but that only works after the renderer is created, which is too late: I need to know the scale factor before creating the window, so I can create it at the right size right away.
Alternatively, is there any way to tell SDL to create a window of a given size in actual pixels rather than in logical display units?
I want to achieve pixel-perfect scaling in my app, and this is really messing up what I’m trying to do (on Mac, at least). As it is, the only way I can think of is to create a dummy window and renderer, compare SDL_GetRendererOutputSize() with the window’s requested size, and then resize accordingly, but that would be an ugly kludge…