- It matches SDL’s behaviour in OS X (and presumably Windows and
Linux, once high-dpi support is implemented in those backends.)
- It lets you render at a high resolution while keeping the physical
size of what you render consistent with what users expect on their
device. In other words you can easily use the “dpi scale” of the
device. This is actually really useful.
- You can save performance on older / lower-end retina iOS devices by
disabling high-dpi awareness at runtime. If you’re using everything
correctly, doing so will just make things look a bit less sharp to
users while saving a ton of memory and GPU pixel processing
performance.
There was some discussion about what SDL’s high-dpi API would be, a
year or so ago: https://bugzilla.libsdl.org/show_bug.cgi?id=1934
(Not directed to you in particular.) Sorry, but that’s terrible, and
Bugzilla is a terrible place for these kinds of discussions. There are
two use cases for SDL:
- I have a game that runs at a fixed resolution. In that case, I only
care about the logical resolution of the game, i.e. the size of the
backing store. Suppose I request a 4000x3000 window. I don’t care if
this is a full 4096x3072 screen or a 8192x6144 screen scaled up x2 or a
16384x12288 screen scaled up x4 or even a 1280x1024 screen scaled down
x0.25.
- I have a game that runs at an arbitrary resolution. In that case I
am already taking responsibility for scaling on my end, so I just want
the biggest, most high-dpi window that I can get.
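The distinction between the two cases can be sketched like this (the helper names are hypothetical, for illustration only; nothing here is a real SDL call):

```python
# Two ways a game might want its window sized; hypothetical helpers,
# not real SDL API.

def fixed_resolution_request(logical_size):
    """Case 1: the game only cares that the backing store is exactly
    this size; the OS may scale it physically by any factor."""
    return {"backing_store": logical_size}

def native_resolution_request(display_size):
    """Case 2: the game scales itself, so it just wants the biggest,
    most high-dpi drawable available."""
    return {"backing_store": display_size}

# A fixed 4000x3000 game gets the same backing store whether the
# screen is 4096x3072 native or 8192x6144 scaled up x2.
```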
In addition, the high-dpi API has the following flaws:
- It treats high-dpi as a binary switch: either you have it or you
don’t. Suppose I had a device with a logical resolution of 320x200 and
a physical resolution of 1280x800. I would now be able to request
either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1),
but not an intermediate 640x400 window (scaled x2).
- I have to call SDL_GL_GetDrawableSize to get the real size of the
back buffer even though I may not be using the OpenGL API (or even an
OpenGL backend).
- It is basically undocumented.
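The binary-switch flaw can be put in numbers, using the same hypothetical 320x200-logical / 1280x800-physical device as above:

```python
# Hypothetical device: logical 320x200, physical 1280x800.
logical = (320, 200)
physical = (1280, 800)

device_ratio = physical[0] // logical[0]  # x4 on this device

# A boolean high-dpi flag exposes exactly two backing-store sizes,
# hence exactly two scale factors:
reachable_scales = {1, device_ratio}  # {1, 4}

# The intermediate x2 backing store (640x400) cannot be requested,
# even though the hardware could display it.
assert 2 not in reachable_scales
```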
Now, to address your points one by one:
- It matches SDL’s behaviour in OS X (and presumably Windows and
Linux, once high-dpi support is implemented in those backends.)
Consistency is good, but I really hope the high-dpi API gets
fixed/replaced/ditched before it becomes too entrenched to change.
- It lets you render at a high resolution while keeping the physical
size of what you render consistent with what users expect on their
device. In other words you can easily use the “dpi scale” of the
device. This is actually really useful.
This would be a valid argument if the “logical pixel” size were
consistent between devices. On PCs, the “logical resolution” can vary
from 640x480 to well over twice that, all without “high-dpi” mode, and
that’s not even taking into account the different monitor sizes, or
mobile ports of PC games.
If you really want to do this, an SDL_GetDPIScale function would work better.
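A sketch of how such a function could be used; SDL_GetDPIScale is the proposal above, not an existing SDL function, and the 96-dpi baseline is my own assumption borrowed from desktop conventions:

```python
# Hypothetical dpi-scale query, sketched in Python.

def get_dpi_scale(display_dpi, baseline_dpi=96.0):
    """Return the 'dpi scale' of a display relative to a 96-dpi
    baseline (assumed, not an SDL-defined constant)."""
    return display_dpi / baseline_dpi

# A 192-dpi panel reports scale 2.0, so a UI element meant to appear
# 100 "logical" pixels wide is rendered 200 physical pixels wide.
scale = get_dpi_scale(192.0)
element_width = round(100 * scale)
```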
- You can save performance on older / lower-end retina iOS devices by
disabling high-dpi awareness at runtime. If you’re using everything
correctly, doing so will just make things look a bit less sharp to
users while saving a ton of memory and GPU pixel processing
performance.
That’s an argument in favor of adding arbitrary up-scaling to SDL, not
the current high-dpi API. For example, I have a PC game that runs at
640x480. If the screen resolution is at least 1280x960, I want to scale
the game up x2. However, 1280x960 is not considered a high-dpi
resolution on PCs, so high-dpi scaling is useless to me. If I want to
run the same game on an iPhone 4+, I suddenly need to use the
high-dpi API in SDL because my base resolution of 640x480 is a high-dpi
resolution on iPhones.
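The scaling policy just described can be sketched in a few lines: run a fixed 640x480 game at the largest integer factor that fits the desktop, with no involvement from the high-dpi API at all.

```python
# Pick the largest integer upscale factor for a fixed-resolution game.

def pick_scale(desktop_w, desktop_h, base_w=640, base_h=480):
    # Largest whole-number factor that still fits both dimensions,
    # falling back to x1 when the desktop is smaller than the game.
    return max(1, min(desktop_w // base_w, desktop_h // base_h))

# 1280x960 desktop -> x2; 1024x768 desktop -> x1.
```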
Here is what my code would have to look like under the current API:
if desired_resolution >= desktop_resolution:
    window_size = desired_resolution / 2
    create_window(window_size)
    actual_resolution = SDL_GL_GetDrawableSize()
    scale_factor = actual_resolution / window_size
    if scale_factor != 2:
        close_window()
        window_size = desired_resolution / scale_factor
        create_window(window_size)
else:
    scale_factor = desktop_resolution / desired_resolution
    create_window(desired_resolution * scale_factor)
    actual_resolution = SDL_GL_GetDrawableSize()
    scale_factor = actual_resolution / desired_resolution
proceed_to_run_game_at(scale_factor)
…and even that doesn’t handle all of the edge cases. Here is what my
code should look like:
scale_factor = desktop_resolution / desired_resolution
create_window(desired_resolution, scale_factor)
proceed_to_run_game_without_caring_about_scaling()
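What that ideal call could compute internally, sketched with hypothetical names (this is not a real SDL signature):

```python
# Sketch of an ideal create_window(resolution, scale) helper: the
# library picks the backing store; the game never sees the scaling.

def ideal_create_window(desired, desktop):
    # Largest scale that still fits the desktop in both dimensions.
    scale = min(desktop[0] / desired[0], desktop[1] / desired[1])
    backing = (round(desired[0] * scale), round(desired[1] * scale))
    # The game renders at `desired`; the library handles the scaling.
    return {"logical": desired, "backing": backing, "scale": scale}

# A 640x480 game on a 1280x960 desktop: backing store 1280x960 at x2.
```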
(Also, if you want to go to a lower resolution to save performance, you
should really also load your textures at a lower resolution. It’s not
as simple as flipping a high-dpi switch.)

On 01.12.2014 10:01, Alex Szpakowski wrote:
–
Rainer Deyke (rainerd at eldwood.com)