On Fri, 1 May 2009 17:15:18 -0400,
Donny Viszneki <donny.viszneki at gmail.com> wrote:
That, my friend, is where native resolution comes in. A monitor’s
native resolution is by definition the perfect display mode for that
monitor, so it’s pretty much guaranteed to not be distorted in any
way,
But that’s insane. I would vastly favor any other option to solve this
problem over this one. It should definitely not be the default
behavior to use the “native resolution” and scale the picture
in-process. On performance- and/or power-constrained systems, this
sort of explosion of resources (for most video back-ends) is
ridiculous! SDL apps should not use more resources for everyone just
to fix the problems of some users.
But that’s the thing: there really aren’t any other options besides
telling users to constantly fix their broken systems, which may sound
fine to you, but really isn’t an option, especially since that may
involve buying new monitors and graphics cards, or installing alternate
operating systems. They don’t want that, they just want to play the
game.
Anyway, this is a very common problem, so it’s worth fixing imo. As for
performance, as we went over before, this should probably only be done
if hardware scaling is available.
In any case, I don’t think any developer should be FORCED to use this.
SDL could perhaps find the native resolution by querying EDID, or
simpler, just assume that the desktop resolution is the native one,
which is usually the case. The latter would also automatically work
with CRT monitors.
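With the 1.2 API that assumption is already easy to act on from the
application side: SDL_GetVideoInfo() has reported the desktop size in
current_w/current_h since 1.2.10, as long as it is called before
SDL_SetVideoMode(). A minimal sketch:

    #include "SDL.h"
    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
            return 1;
        }

        /* Query before SDL_SetVideoMode(): current_w/current_h then
           describe the desktop, which we take as the (assumed) native
           resolution. */
        const SDL_VideoInfo *info = SDL_GetVideoInfo();
        printf("Assumed native resolution: %dx%d\n",
               info->current_w, info->current_h);

        SDL_Quit();
        return 0;
    }

The proposal, of course, is that SDL would do the equivalent of this
internally instead of every application doing it by hand.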
That is something I hadn’t considered. To me it seems very hackish. At
the very least, if you’re going to “just assume that the desktop
resolution is the native one” you had better provide an ENVIRONMENT
VARIABLE to override this behavior!!! Some people change their display
resolution frequently. Some even use those scrolling desktop modes
that use a framebuffer larger than the display, and the video signal
output is assembled from a variable position in the framebuffer. On X
it’s a simple keyboard shortcut: CTRL+ALT+NPMINUS or CTRL+ALT+NPPLUS.
That’s virtual resolution. I’m talking about the true resolution.
Anyway, I’m not against having the ability to override with environment
variables; options are nice. It just REALLY shouldn’t be REQUIRED.
On that note: does anyone know how these modes interact with what
modes show up when querying available display modes?
assuming the monitor uses square pixels (a few don’t, but I don’t
think there’s any way to find this out, so let’s just ignore that).
You can query for the resolution tuple (MSWindows was once
particularly fond of 72x72 DPI for some reason) and get the info from
there.
Yes, but this info is frequently wrong.
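On Windows, for instance, the reported pair comes from GetDeviceCaps();
a minimal sketch below, and the values are simply whatever the system is
configured to report, which is exactly why they are so often wrong:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HDC hdc = GetDC(NULL);                       /* screen device context  */
        int dpi_x = GetDeviceCaps(hdc, LOGPIXELSX);  /* reported horizontal DPI */
        int dpi_y = GetDeviceCaps(hdc, LOGPIXELSY);  /* reported vertical DPI   */
        ReleaseDC(NULL, hdc);

        printf("Reported DPI: %dx%d\n", dpi_x, dpi_y);
        return 0;
    }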
What does 1.3 have to do with environment variables?
1.2 API is frozen, 1.3 API isn’t.
There is not a lot of development work happening for 1.2, but I
believe an API freeze implies one must retain backward compatibility
with any program using the old API. I do not suppose that covers
adding new API functions or environment variables. Even if Sam did
intend to also freeze the addition of new API functions, I would not
assume this means environment variables are still allowed.
I guess that depends on whether you consider environment variables part of
the API. I don’t. If a game has to set environment variables to
configure SDL, a real API would be better. As I see it, environment
variables are for the users’ convenience, so they can override default
behaviour if something goes wrong in SDL. Not for developers.
(I don’t mean to contradict every individual thing you say!)
Ditto =)
At the minimum, SDL could allow you to specify logical resolution
and physical resolution separately.
Well, unless you only want boxing used, you will also need to specify
scaling options.
But hey, what’s this? We agree on something? =)
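For what it’s worth, the “scaling options” part is mostly a policy
question. As a concrete example (a hypothetical helper, not anything in
SDL), here is the arithmetic for the boxed case, i.e. where a
logical-resolution picture ends up inside a physical (native) mode:

    #include <stdio.h>

    /* Hypothetical helper: compute where a logical_w x logical_h image
       lands inside a physical_w x physical_h mode when letterboxed
       (aspect ratio preserved, centered, padded with bars). */
    static void letterbox(int logical_w, int logical_h,
                          int physical_w, int physical_h,
                          int *dst_x, int *dst_y, int *dst_w, int *dst_h)
    {
        /* Largest scale factor that still fits the logical image. */
        double sx = (double)physical_w / logical_w;
        double sy = (double)physical_h / logical_h;
        double s  = sx < sy ? sx : sy;

        *dst_w = (int)(logical_w * s + 0.5);
        *dst_h = (int)(logical_h * s + 0.5);
        *dst_x = (physical_w - *dst_w) / 2;   /* bars left/right */
        *dst_y = (physical_h - *dst_h) / 2;   /* bars top/bottom */
    }

    int main(void)
    {
        int x, y, w, h;
        /* e.g. a 640x480 game on a 1920x1080 native panel */
        letterbox(640, 480, 1920, 1080, &x, &y, &w, &h);
        printf("draw at (%d,%d) size %dx%d\n", x, y, w, h);
        return 0;
    }

Stretch-to-fill or integer-only scaling would just be different
policies plugged into the same spot, which is why the application
would have to be able to choose one.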