iOS 8 progress

Wait, what? Why would you do that?

On 01.12.2014 03:22, Alex Szpakowski wrote:

I haven’t changed any public SDL APIs compared to the current SDL
2.0.4 / mercurial version. The biggest non-bugfix change to
functionality is that SDL_CreateWindow now only enables
Retina-resolution if the SDL_WINDOW_ALLOW_HIGHDPI flag is used, and
once it’s enabled you’ll need to use SDL_GL_GetDrawableSize (or
SDL_GetRendererOutputSize if you’re using the SDL_Render API) to get
the size of the drawable in pixels. SDL_GetWindowSize and the active
video display modes now always give their sizes in “points” rather
than pixels.
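
(A minimal sketch of the behaviour described above, using only the calls named in this thread and assuming an SDL 2.0.4-era build; error handling omitted:)

#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Opt in to high-dpi; without SDL_WINDOW_ALLOW_HIGHDPI the drawable
     * simply matches the requested size. */
    SDL_Window *window = SDL_CreateWindow("highdpi demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480,
        SDL_WINDOW_OPENGL | SDL_WINDOW_ALLOW_HIGHDPI);
    SDL_GLContext ctx = SDL_GL_CreateContext(window);

    int pt_w, pt_h, px_w, px_h;
    SDL_GetWindowSize(window, &pt_w, &pt_h);      /* size in points */
    SDL_GL_GetDrawableSize(window, &px_w, &px_h); /* size in pixels */
    SDL_Log("window: %dx%d points, drawable: %dx%d pixels",
            pt_w, pt_h, px_w, px_h);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}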


Rainer Deyke (rainerd at eldwood.com)

A few reasons:

  • It matches SDL’s behaviour in OS X (and presumably Windows and Linux, once high-dpi support is implemented in those backends.)
  • It lets you render at a high resolution while keeping the physical size of what you render consistent with what users expect on their device. In other words, you can easily use the “dpi scale” of the device. This is actually really useful.
  • You can save performance on older / lower-end Retina iOS devices by disabling high-dpi awareness at runtime. If you’re using everything correctly, doing so will just make things look a bit less sharp to users while saving a ton of memory and GPU pixel processing performance.

There was some discussion about what SDL’s high-dpi API would be, a year or so ago: https://bugzilla.libsdl.org/show_bug.cgi?id=1934

On Dec 1, 2014, at 4:48 AM, Rainer Deyke wrote:

Wait, what? Why would you do that?


Rainer Deyke (rainerd at eldwood.com)

Hi Alex,

This might cause issues where SDL is already implemented as a back-end, since it
deviates from the original behaviour. I wonder if such a change would be
accepted on the main branch. I am a bit confused: when we call SDL on iPhone,
it just opens a window with resolution x by y, no matter if it is Retina or
not. Is it possible to have a preprocessor macro for the old behaviour?

On Mon, Dec 1, 2014 at 9:01 AM, Alex Szpakowski wrote:

A few reasons:

  • It matches SDL’s behaviour in OS X (and presumably Windows and Linux,
    once high-dpi support is implemented in those backends.)
  • It lets you render at a high resolution while keeping the physical size
    of what you render consistent with what users expect on their device. In
    other words, you can easily use the “dpi scale” of the device. This is
    actually really useful.
  • You can save performance on older / lower-end Retina iOS devices by
    disabling high-dpi awareness at runtime. If you’re using everything
    correctly, doing so will just make things look a bit less sharp to users
    while saving a ton of memory and GPU pixel processing performance.

There was some discussion about what SDL’s high-dpi API would be, a year
or so ago: https://bugzilla.libsdl.org/show_bug.cgi?id=1934

On Dec 1, 2014, at 4:48 AM, Rainer Deyke wrote:

Wait, what? Why would you do that?


Rainer Deyke (rainerd at eldwood.com)


  • It matches SDL’s behaviour in OS X (and presumably Windows and
    Linux, once high-dpi support is implemented in those backends.)
  • It lets you render at a high resolution while keeping the physical
    size of what you render consistent with what users expect on their
    device. In other words, you can easily use the “dpi scale” of the
    device. This is actually really useful.
  • You can save performance on older / lower-end Retina iOS devices by
    disabling high-dpi awareness at runtime. If you’re using everything
    correctly, doing so will just make things look a bit less sharp to
    users while saving a ton of memory and GPU pixel processing
    performance.

There was some discussion about what SDL’s high-dpi API would be, a
year or so ago: https://bugzilla.libsdl.org/show_bug.cgi?id=1934

(Not directed to you in particular.) Sorry, but that’s terrible, and
Bugzilla is a terrible place for these kinds of discussions. There are
two use cases for SDL:

  • I have a game that runs at a fixed resolution. In that case, I
    only care about the logical resolution of the game, i.e. the size of the
    backing store. Suppose I request a 4000x3000 window. I don’t care if
    this is a full 4096x3072 screen or an 8192x6144 screen scaled up x2 or a
    16384x12288 screen scaled up x4 or even a 1280x1024 screen scaled down
    x0.25.

  • I have a game that runs at an arbitrary resolution. In that case I
    am already taking responsibility for scaling on my end, so I just want
    the biggest, most high-dpi window that I can get.

In addition, the high-dpi API has the following flaws:

  • It treats high-dpi as a binary switch: either you have it or you
    don’t. Suppose I had a device with a logical resolution of 320x200 and
    a physical resolution of 1280x800. I would now be able to request
    either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1),
    but not an intermediate 640x400 window (scaled x2).

  • I have to call SDL_GL_GetDrawableSize to get the real size of the
    back buffer even though I may not be using the OpenGL API (or even an
    OpenGL backend).

  • It is basically undocumented.

Now, to address your points one by one:

  • It matches SDL’s behaviour in OS X (and presumably Windows and
    Linux, once high-dpi support is implemented in those backends.)

Consistency is good, but I really hope the high-dpi API gets
fixed/replaced/ditched before it becomes too entrenched to change.

  • It lets you render at a high resolution while keeping the physical
    size of what you render consistent with what users expect on their
    device. In other words, you can easily use the “dpi scale” of the
    device. This is actually really useful.

This would be a valid argument if the “logical pixel” size were
consistent between devices. On PCs, the “logical resolution” can vary
from 640x480 to well over twice that, all without “high-dpi” mode, and
that’s not even taking into account different monitor sizes, or mobile
ports of the same PC game.

If you really want to do this, an SDL_GetDPIScale function would work better.
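
(For what it's worth, an application can already approximate such a helper with the existing calls; a rough sketch, where GetDPIScale is a hypothetical application-side function, not an SDL API:)

#include <SDL.h>

/* Hypothetical helper (not part of SDL): derive a DPI scale factor from the
 * window size in points versus the GL drawable size in pixels. */
static float GetDPIScale(SDL_Window *window)
{
    int pt_w = 0, pt_h = 0, px_w = 0, px_h = 0;
    SDL_GetWindowSize(window, &pt_w, &pt_h);      /* points */
    SDL_GL_GetDrawableSize(window, &px_w, &px_h); /* pixels */
    return (pt_w > 0) ? (float)px_w / (float)pt_w : 1.0f;
}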

  • You can save performance on older / lower-end Retina iOS devices by
    disabling high-dpi awareness at runtime. If you’re using everything
    correctly, doing so will just make things look a bit less sharp to
    users while saving a ton of memory and GPU pixel processing
    performance.

That’s an argument in favor of adding arbitrary up-scaling to SDL, not
the current high-dpi API. For example, I have a PC game that runs at
640x480. If the screen resolution is at least 1280x960, I want to scale
the game up x2. However, 1280x960 is not considered a high-dpi
resolution on PCs, so high-dpi scaling is useless to me. If I want to
run the same game on an iPhone 4+, I suddenly need to use the
high-dpi API in SDL because my base resolution of 640x480 is a high-dpi
resolution on iPhones.

Here is what my code would have to look like under the current API:

if desired_resolution >= desktop_resolution:
    window_size = desired_resolution / 2
    create_window(window_size)
    actual_resolution = SDL_GL_GetDrawableSize()
    scale_factor = actual_resolution / window_size
    if scale_factor != 2:
        close_window()
        window_size = desired_resolution / scale_factor
        create_window(window_size)
else:
    scale_factor = desktop_resolution / desired_resolution
    create_window(desired_resolution * scale_factor)
actual_resolution = SDL_GL_GetDrawableSize()
scale_factor = actual_resolution / desired_resolution
proceed_to_run_game_at(scale_factor)

…and even that doesn’t handle all of the edge cases. Here is what my
code should look like:

scale_factor = desktop_resolution / desired_resolution
create_window(desired_resolution, scale_factor)
proceed_to_run_game_without_caring_about_scaling()

(Also, if you want to go to a lower resolution to save performance, you
should really also load your textures at a lower resolution. It’s not
as simple as flipping a high-dpi switch.)

On 01.12.2014 10:01, Alex Szpakowski wrote:


Rainer Deyke (rainerd at eldwood.com)

This might cause issues where SDL is already implemented as back-end as it deviates from the original behaviour.

I agree; I’ve asked Sam in the other thread whether the change is acceptable for an SDL point release.

I am a bit confused: when we call SDL on iPhone, it just opens a window with resolution x by y, no matter if it is Retina or not.

I can illustrate with some screenshots of a modified version of one of the SDL iOS test apps.

Here it is running on an iPad 2: http://i.imgur.com/pH2ubIv.png
The iPad 2 doesn’t have a high-dpi / Retina screen, and has a resolution of 768x1024.

Here’s the same test using the old SDL 2.0.3 way of doing things on iOS, running on an iPad Air: http://i.imgur.com/YNPgFIW.png
The iPad Air has the same physical screen size as the iPad 2, but it has a Retina resolution of 1536x2048 (I’ve scaled down the image by 50% so it’s easier to compare with the iPad 2.)

Here’s the same test using my changes with high dpi-aware code, running on the iPad Air again: http://i.imgur.com/7Ln3hNX.png
(I’ve also scaled down that image.)

If you only go by pixels without accounting for the Retina DPI scaling, then at least your interface (and probably your whole game if it’s 2D) will look tiny on modern iOS devices and normal-sized on older ones.

If you use the SDL_Render API you can use SDL_RenderSetLogicalSize, but that’s not quite the same as this (and they can work in combination), especially since it pretends that a 3.5" iPhone and a 10" iPad have the same size.
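
(A minimal sketch of that SDL_Render path, assuming an SDL 2.0.x build; error handling omitted:)

#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Draw in fixed 640x480 logical units and let SDL scale to the
     * (possibly high-dpi) renderer output. */
    SDL_Window *window = SDL_CreateWindow("logical size demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480,
        SDL_WINDOW_ALLOW_HIGHDPI);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, 0);

    int out_w, out_h;
    SDL_GetRendererOutputSize(renderer, &out_w, &out_h); /* pixels */
    SDL_RenderSetLogicalSize(renderer, 640, 480);        /* logical units */
    SDL_Log("renderer output: %dx%d pixels", out_w, out_h);

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}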

There are two use cases for SDL: […]

FWIW there are many more use cases. For many people SDL is just used as an API to abstract windowing and input across multiple platforms. Since this is a feature of OS windowing APIs, it makes sense to expose it in an abstracted manner.

In addition, the high-dpi API has the following flaws:

  • It treats high-dpi as a binary switch: either you have it or you don’t. Suppose I had a device with a logical resolution of 320x200 and a physical resolution of 1280x800. I would now be able to request either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1), but not an intermediate 640x400 window (scaled x2).

I think you’re misunderstanding the purpose of the operating system-level high-DPI scaling (which this is.) In OS X it is a binary switch that apps do have to opt in to get. On iOS it’s essentially a binary switch as well (if you want the OS to use the fast path for compositing when high-dpi mode is enabled.)

Apple’s documentation elaborates:
https://developer.apple.com/library/mac/documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/Explained/Explained.html
https://developer.apple.com/library/mac/documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/CapturingScreenContents/CapturingScreenContents.html

  • I have to call SDL_GL_GetDrawableSize to get the real size of the back buffer even though I may not be using the OpenGL API (or even an OpenGL backend).

You can use SDL_GetRendererOutputSize if you’re using the SDL_Render API. If you aren’t using SDL_GL or SDL_Render at all then there should be no issue, since those are the only ways that SDL exposes a graphics drawable that can have high-dpi mode enabled.

  • It lets you render at a high resolution while keeping the physical
    size of what you render consistent with what users expect on their
    device. In other words, you can easily use the “dpi scale” of the
    device. This is actually really useful.

This would be a valid argument if the “logical pixel” size were consistent between devices. On PCs, the “logical resolution” can vary from 640x480 to well over twice that, all without “high-dpi” mode, and that’s not even taking into account different monitor sizes, or mobile ports of the same PC game.

A major reason that the operating system APIs let you toggle high-dpi mode and convert between coordinate spaces is that it’s specifically designed to make things a higher resolution and pixel density while keeping the same “logical” scale as you would have if high-dpi were disabled. A major reason for disabling it is that, for games in particular, the number of pixels to process and the memory usage of buffers and textures can simply be too high on some systems.

That actually wasn’t on this mailing list, my bad! In any case it’s been asked, and if it turns out that the change is too big (or completely unwanted) then I’ll revert that part of my branch to SDL 2.0.3’s behaviour.

On Dec 1, 2014, at 6:56 AM, Alex Szpakowski <@Alex_Szpakowski> wrote:

I’ve asked Sam in the other thread whether the change is acceptable for an SDL point release.

I have done this for Ogre3d, where my contentScaleFactor matches the screen
bounds. There could be another multiplier to tweak the ratio between the
two. Is that something similar to what you are doing, or am I missing
something?

MyView* myView = [[MyView alloc] initWithFrame:CGRectMake(0, 0, width, height)];
myView->viewcontroller = GetSDLViewController(sdlWindow);
[(GetSDLViewController(sdlWindow)).view addSubview:myView];
myView.contentScaleFactor = [UIScreen mainScreen].scale;

On Mon, Dec 1, 2014 at 11:36 AM, Alex Szpakowski wrote:

That actually wasn’t on this mailing list, my bad! In any case it’s been
asked, and if it turns out that the change is too big (or completely
unwanted) then I’ll revert that part of my branch to SDL 2.0.3’s behaviour.

On Dec 1, 2014, at 6:56 AM, Alex Szpakowski wrote:

I’ve asked Sam in the other thread whether the change is acceptable for
an SDL point release.


There are two use cases for SDL: […]

FWIW there are many more use cases. For many people SDL is just used
as an API to abstract windowing and input across multiple platforms.

Yeah, but pretty much all of them fit either into the "fixed resolution"
or the “variable resolution” camps.

In addition, the high-dpi API has the following flaws:

  • It treats high-dpi as a binary switch: either you have it or you
    don’t. Suppose I had a device with a logical resolution of 320x200
    and a physical resolution of 1280x800. I would now be able to
    request either a 320x200 window (scaled x4) or a 1280x800 window
    (scaled x1), but not an intermediate 640x400 window (scaled x2).

I think you’re misunderstanding the purpose of the operating
system-level high-DPI scaling (which this is.) In OS X it is a
binary switch that apps do have to opt in to get. On iOS it’s
essentially a binary switch as well (if you want the OS to use the
fast path for compositing when high-dpi mode is enabled.)

Yes, right now under these particular operating systems that’s the case,
but Apple has never been particularly good about forward and backward
compatibility. So what happens if Apple decides that the iPhone 10 needs
a 652 dpi “double retina” screen? Suddenly your binary switch has three
possible states. What happens when you port your game from mobile to
Mac and your high-dpi game is now restricted to a tiny window on a huge
5120x2880 screen? What happens when a random future operating system
decides that “high-dpi” has not twice but three times the base resolution?

IMO, the purpose of SDL is to isolate the programmer from these kinds of
low-level details.

  • I have to call SDL_GL_GetDrawableSize to get the real size of the
    back buffer even though I may not be using the OpenGL API (or even
    an OpenGL backend).

You can use SDL_GetRendererOutputSize if you’re using the SDL_Render
API. If you aren’t using SDL_GL or SDL_Render at all then there
should be no issue, since those are the only ways that SDL exposes a
graphics drawable that can have high-dpi mode enabled.

There’s also SDL_GetWindowSurface, although that carries the screen
resolution along with it. That’s two-and-a-half different functions for
getting the same information.

On 01.12.2014 12:15, Alex Szpakowski wrote:


Rainer Deyke (rainerd at eldwood.com)

I think the real problem in this discussion is that we keep
talking as if we’re using pixels, when these days operating
systems don’t do that anymore (and SDL2 fails to reflect this).

That said, I guess it’d be nice to get an easy way to get a window
size in pixels before even creating one, just for the sake of being
able to implement a resolution list. But I suppose that people will
argue that in fullscreen you should use the native resolution, and
that in windowed there isn’t any reason to restrict to arbitrary sizes
(it still screws up if you want to allow only integer-sized scaling,
i.e. if the program uses pixelart).

Rainer Deyke wrote:

On 01.12.2014 12:15, Alex Szpakowski wrote:

Yes, right now under these particular operating systems that’s the case,
but Apple has never been particularly good about forward and backward
compatibility. So what happens if Apple decides that the iPhone 10 needs
a 652 dpi “double retina” screen? Suddenly your binary switch has three
possible states. What happens when you port your game from mobile to
Mac and your high-dpi game is now restricted to a tiny window on a huge
5120x2880 screen? What happens when a random future operating system
decides that “high-dpi” has not twice but three times the base resolution?

IMO, the purpose of SDL is to isolate the programmer from these kinds of
low-level details.

Rainer Deyke (rainerd at eldwood.com)

Yeah, the current way SDL handles the content scale factor breaks each time Apple updates iOS; it doesn't seem very future-proof.


Rodrigo Cardoso Rocha
@RodrigoRodrigoR - twitter.com/RodrigoRodrigoR
Chibata Creations - chibatacreations.com

Yes, right now under these particular operating systems that’s the case, but Apple has never been particularly good about forward and backward compatibility. So what happens if Apple decides that the iPhone 10 needs a 652 dpi “double retina” screen? Suddenly your binary switch has three possible states. What happens when you port your game from mobile to Mac and your high-dpi game is now restricted to a tiny window on a huge 5120x2880 screen? What happens when a random future operating system decides that “high-dpi” has not twice but three times the base resolution?

IMO, the purpose of SDL is to isolate the programmer from these kinds of low-level details.

The iPhone 6 Plus has a “3x” Retina display. The SDL (and iOS) API handles it with no problem. You aren’t expected to hard-code a 2x scale into your app; in fact, the high-DPI APIs make it easier to avoid hard-coding that kind of thing.

Again, one of the main purposes of SDL is to provide a single API that covers the useful functionality of the windowing APIs it supports.

On Dec 1, 2014, at 11:25 AM, Rainer Deyke wrote:

Well, speaking from experience, SDL’s code for handling the content scale factor has needed updating much less than almost every other part of the UIKit backend code: rotation/orientation changes, deprecated frameworks (e.g. UIAccelerometer versus CoreMotion), launch screen changes, and interface API changes like the code for determining status bar visibility have all been much bigger issues in the codebase.

In fact, SDL’s code for handling the content scale factor hasn’t needed updating at all, aside from adding support for the new “native scale” so that the iPhone 6 Plus uses a pixel-perfect 1080x1920 rather than downscaled 1242x2208. The other changes I made to that area of the code were to make it behave the same way as SDL on OS X does.

On Dec 1, 2014, at 2:16 PM, RodrigoCard wrote:

Yeah, the current way SDL handles the content scale factor breaks each time Apple updates iOS; it doesn't seem very future-proof.

Rodrigo Cardoso Rocha
@RodrigoRodrigoR - twitter.com/RodrigoRodrigoR
Chibata Creations - chibatacreations.com


I think that would be useful, yeah. Maybe the API could be something like: SDL_GetDisplayModePixelSize(const SDL_DisplayMode *mode, int *w, int *h) (maybe with a better function name.)

On Dec 1, 2014, at 12:00 PM, Sik the hedgehog <sik.the.hedgehog at gmail.com> wrote:

That said, I guess it’d be nice to get an easy way to get a window
size in pixels before even creating one, just for the sake of being
able to implement a resolution list. But I suppose that people will
argue that in fullscreen you should use the native resolution, and
that in windowed there isn’t any reason to restrict to arbitrary sizes
(it still screws up if you want to allow only integer-sized scaling,
i.e. if the program uses pixelart).


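(For illustration, a sketch of how the proposed call might be used to build a resolution list. SDL_GetDisplayModePixelSize is only a suggestion in this thread, not an existing SDL function; SDL_GetNumDisplayModes and SDL_GetDisplayMode do exist:)

/* Hypothetical usage of the proposed SDL_GetDisplayModePixelSize. */
int i, n = SDL_GetNumDisplayModes(0);
for (i = 0; i < n; i++) {
    SDL_DisplayMode mode;
    int px_w, px_h;
    SDL_GetDisplayMode(0, i, &mode);
    SDL_GetDisplayModePixelSize(&mode, &px_w, &px_h);  /* proposed API */
    SDL_Log("mode %d: %dx%d points, %dx%d pixels", i, mode.w, mode.h, px_w, px_h);
}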

Why not just specify whether you want 1:1 resolution versus the content-scale
factor? If you want the content-scale factor, just specify the factor.

On Mon, Dec 1, 2014 at 9:21 PM, Alex Szpakowski wrote:

I think that would be useful, yeah. Maybe the API could be something like:
SDL_GetDisplayModePixelSize(const SDL_DisplayMode *mode, int *w, int *h)
(maybe with a better function name.)

On Dec 1, 2014, at 12:00 PM, Sik the hedgehog <sik.the.hedgehog at gmail.com> wrote:

That said, I guess it’d be nice to get an easy way to get a window
size in pixels before even creating one, just for the sake of being
able to implement a resolution list. But I suppose that people will
argue that in fullscreen you should use the native resolution, and
that in windowed there isn’t any reason to restrict to arbitrary sizes
(it still screws up if you want to allow only integer-sized scaling,
i.e. if the program uses pixelart).


That’s what the SDL_WINDOW_ALLOW_HIGHDPI flag does (on platforms where it’s implemented.) It can still be useful to know if a display mode on a monitor supports that in the first place, before that display mode is used or a window is created, though.

On Dec 1, 2014, at 6:36 PM, Alexander Chaliovski wrote:

Why not just specify whether you want 1:1 resolution versus the content-scale factor? If you want the content-scale factor, just specify the factor.


  • It treats high-dpi as a binary switch: either you have it or you
    don’t. Suppose I had a device with a logical resolution of 320x200 and
    a physical resolution of 1280x800. I would now be able to request
    either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1),
    but not an intermediate 640x400 window (scaled x2).

I think that flag is (and has always been) very misunderstood.

It basically says “give me all the pixels the hardware has, not what the
OS is telling apps it has while actually scaling them,” and we have
other APIs that let you get the finer details if you want them. The
original intention was to deal with Mac OS X setting the desktop on
Retina MacBooks to 1440x900 when the hardware was capable of 2880x1800.

The API doesn’t care at all about how much you are scaling; it’s just
designed to prevent you from being locked out of real physical pixels if
you want them (not all apps do!), which was a real problem for
FULLSCREEN_DESKTOP apps.
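
(A small sketch of that FULLSCREEN_DESKTOP case; the 1440x900 / 2880x1800 numbers are the Retina MacBook example from above, and error handling is omitted:)

#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);

    /* Fullscreen-desktop window that opts in to the real pixel grid; without
     * SDL_WINDOW_ALLOW_HIGHDPI the drawable matches the OS-scaled desktop size. */
    SDL_Window *window = SDL_CreateWindow("fullscreen demo", 0, 0, 0, 0,
        SDL_WINDOW_FULLSCREEN_DESKTOP | SDL_WINDOW_OPENGL | SDL_WINDOW_ALLOW_HIGHDPI);
    SDL_GLContext ctx = SDL_GL_CreateContext(window);

    int pt_w, pt_h, px_w, px_h;
    SDL_GetWindowSize(window, &pt_w, &pt_h);      /* e.g. 1440x900 on a Retina MacBook */
    SDL_GL_GetDrawableSize(window, &px_w, &px_h); /* e.g. 2880x1800 */
    SDL_Log("desktop: %dx%d points, drawable: %dx%d pixels", pt_w, pt_h, px_w, px_h);

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}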

  • I have to call SDL_GL_GetDrawableSize to get the real size of the
    back buffer even though I may not be using the OpenGL API (or even an
    OpenGL backend).

That’s a solvable problem.

  • It is basically undocumented.

That’s definitely a solvable problem.

–ryan.

That’s what the SDL_WINDOW_ALLOW_HIGHDPI flag does (on platforms
where it’s implemented.) It can still be useful to know if a display
mode on a monitor supports that in the first place, before that
display mode is used or a window is created, though.

Why not just specify whether you want 1:1 resolution versus the
content-scale factor? If you want the content-scale factor, just
specify the factor.

No, it doesn’t. SDL_WINDOW_ALLOW_HIGHDPI implements a fixed,
platform-dependent scaling factor. What I want is an arbitrary scaling
factor, specified by the user, implemented in a platform-independent manner.

On 01.12.2014 23:38, Alex Szpakowski wrote:

On Dec 1, 2014, at 6:36 PM, Alexander Chaliovski wrote:


Rainer Deyke (rainerd at eldwood.com)

  • It treats high-dpi as a binary switch: either you have it or you
    don’t. Suppose I had a device with a logical resolution of 320x200 and
    a physical resolution of 1280x800. I would now be able to request
    either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1),
    but not an intermediate 640x400 window (scaled x2).

I think that flag is (and has always been) very misunderstood.

No, I understand it all right. I just think it’s crap.

It basically says “give me all the pixels the hardware has, not what the
OS is telling apps it has while actually scaling them,” and we have
other APIs that let you get the finer details if you want them.

That should be the default. It’s completely insane that this isn’t the
default. When I ask SDL for a 4000x3000 window, I want exactly
4000x3000 actual physical pixels. When I ask SDL for the desktop
resolution, I want it in actual physical pixels. When I read mouse
movement events, I want them in actual physical pixels. I have a
14-year-old multiple-fixed-resolution game that supports 2560x1920,
written before high-dpi was a thing. I want to be able to run the game
at its full resolution without having to code around the insanity of
modern operating systems.

The API doesn’t care at all about how much you are scaling; it’s just
designed to prevent you from being locked out of real physical pixels if
you want them (not all apps do!),

Yes, automatic scaling is a useful feature, but that has nothing to do
with high-dpi modes. It’s a useful feature at 4096x2160 physical
pixels, it’s a useful feature at 1280x1024 physical pixels, it’s even a
useful feature at 320x200 physical pixels. Moreover, the scaling factor
I want to use is completely independent from the scaling factor that the
operating system wants me to use. For a game written for a fixed
resolution, I want the highest scaling factor that fits on the screen,
which is likely to be an odd number. For a game written for arbitrary
resolutions, I want to give the user full control over the scaling
factor so that she can choose the best trade-off between resolution and
performance. In no case does the operating system’s default scaling
factor even influence my choice.

On 02.12.2014 07:44, Ryan C. Gordon wrote:


Rainer Deyke (rainerd at eldwood.com)

Rainer Deyke writes:

  • It treats high-dpi as a binary switch: either you have it or you
    don’t. Suppose I had a device with a logical resolution of 320x200 and
    a physical resolution of 1280x800. I would now be able to request
    either a 320x200 window (scaled x4) or a 1280x800 window (scaled x1),
    but not an intermediate 640x400 window (scaled x2).

I think that flag is (and has always been) very misunderstood.

No, I understand it all right. I just think it’s crap.

[…]

The API doesn’t care at all about how much you are scaling; it’s just
designed to prevent you from being locked out of real physical pixels if
you want them (not all apps do!),

Yes, automatic scaling is a useful feature, but that has nothing to do
with high-dpi modes. It’s a useful feature at 4096x2160 physical
pixels, it’s a useful feature at 1280x1024 physical pixels, it’s even a
useful feature at 320x200 physical pixels. Moreover, the scaling factor
I want to use is completely independent from the scaling factor that the
operating system wants me to use. For a game written for a fixed
resolution, I want the highest scaling factor that fits on the screen,
which is likely to be an odd number. For a game written for arbitrary
resolutions, I want to give the user full control over the scaling
factor so that she can choose the best trade-off between resolution and
performance. In no case does the operating system’s default scaling
factor even influence my choice.

It sounds like you don’t understand what the high-dpi flag means (or
"dpi aware" as Windows calls it, I believe). When an application
requests the high-dpi mode it is promising that it will adjust its
display according to the actual resolution of the screen rather than
assuming it is 72 or 96 dpi. If the application does not make this
promise, chances are it has designed its UI elements to have a sensible
size at 72 or 96 dpi. You can imagine what that does to usability at
much higher resolutions.

So yes, it is a binary switch: Either you deal correctly with
high-resolution modes, or you don’t. If you want specific scaling
factors, you’ll have to handle it yourself. If you want full access to
the physical pixels, enable high-dpi mode. If you just want to be able
to continue drawing stuff as if all screens are roughly the same old
resolution, keep the high-dpi mode off and things will keep working as
expected.

In short: non-high-dpi mode is a compatibility trick to make typical
applications work well with high-resolution screens. It is not a feature
to request automatic scaling of graphics.

eirik

On 02.12.2014 07:44, Ryan C. Gordon wrote: