SDL_Window suggestion

Since SDL now has built-in support for scaling, it would be nice if we could set up a way to do it automatically. Another framework I’ve worked with in the past had a concept of separate physical and logical dimensions, and it was very useful. It let you define a constant resolution for your game engine to draw to and preserved that independent of changes to your window size.

Any way we could get this added to SDL 1.3?

Den Mon, 20 Apr 2009 06:11:38 -0700 (PDT)
skrev Mason Wheeler :

Since SDL now has built-in support for scaling, it would be nice if
we could set up a way to do it automatically. Another framework I’ve
worked with in the past had a concept of separate physical and
logical dimensions, and it was very useful. It let you define a
constant resolution for your game engine to draw to and preserved
that independent of changes to your window size.

Any way we could get this added to SDL 1.3?

This would be great. Especially since most LCD monitors have this
unfortunate tendency to stretch fullscreen modes to the entire screen
while ignoring aspect ratio, so for fixed-resolution stuff you pretty
much have to add black bars yourself to keep everything from getting
distorted. For pixel graphics, having the option of integer-only
zoom factors would also be nice.

  • Gerry

Den Mon, 20 Apr 2009 06:11:38 -0700 (PDT)
skrev Mason Wheeler :

Since SDL now has built-in support for scaling, it would be nice if
we could set up a way to do it automatically. Another framework I’ve
worked with in the past had a concept of separate physical and
logical dimensions, and it was very useful. It let you define a
constant resolution for your game engine to draw to and preserved
that independent of changes to your window size.

Any way we could get this added to SDL 1.3?

SDL used to focus on accelerated “blitting” operations which more or
less just copy data, sometimes combining it with other data. Now SDL
has shifted its focus to rasterizing/resampling operations. Both of
these methods are designed around what the devices of the day are
capable of doing.

Keeping in mind that SDL does not endeavor to become an “everything but
the kitchen sink” package, I believe it would be a good proposition
for SDL to support transforming incoming rects/coords for drawing
operations by an arbitrary scaling factor specified through an API like
“SDL_SetScale()” or “SDL_SetVirtualDimensions().” This is somewhat
analogous to OpenGL’s transformation matrices.
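
To sketch what I mean (purely hypothetical: neither SDL_SetScale() nor
SDL_SetVirtualDimensions() exists, and the numbers are made up), the
transformation such an API implies is just a rect rescale:

    /* Hypothetical sketch only: SDL has no virtual-dimension API today.
     * A call like SDL_SetVirtualDimensions(640, 480) would simply record
     * the logical size, and every incoming rect would be transformed
     * like this before the real drawing operation runs. */
    #include <stdio.h>
    #include "SDL.h"

    static int virt_w = 640, virt_h = 480;   /* what the game draws to   */
    static int phys_w = 1280, phys_h = 960;  /* actual framebuffer size  */

    static SDL_Rect to_physical(SDL_Rect r)
    {
        SDL_Rect out;
        out.x = r.x * phys_w / virt_w;
        out.y = r.y * phys_h / virt_h;
        out.w = r.w * phys_w / virt_w;
        out.h = r.h * phys_h / virt_h;
        return out;
    }

    int main(int argc, char *argv[])
    {
        SDL_Rect logical = { 100, 100, 32, 32 };
        SDL_Rect actual = to_physical(logical);
        printf("(%d,%d) %dx%d -> (%d,%d) %dx%d\n",
               logical.x, logical.y, logical.w, logical.h,
               actual.x, actual.y, actual.w, actual.h);
        return 0;
    }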

This would be great. Especially since most LCD monitors have this
unfortunate tendency to stretch fullscreen modes to the entire screen
while ignoring aspect ratio, so for fixed-resolution stuff you pretty
much have to add black bars yourself to keep everything from getting
distorted. For pixel graphics, having the option of integer-only
zoom factors would also be nice.

This is really a separate issue, distinct both from SDL’s scaling
abilities and from Gerry’s suggestion.

The problem you’re having is that neither your display nor your
GPU / OS will “letterbox” your picture.

SDL already has support for automatic letterboxing IIRC; however, your
GPU / OS must not claim that a good unboxed display resolution is
available!

On Wed, Apr 29, 2009 at 9:59 PM, Gerry JJ wrote:


http://codebad.com/

Den Wed, 29 Apr 2009 22:26:13 -0400
skrev Donny Viszneki <donny.viszneki at gmail.com>:

This would be great. Especially since most LCD monitors have this
unfortunate tendency to stretch fullscreen modes to the entire
screen while ignoring aspect ratio, so for fixed-resolution stuff
you pretty much have to add black bars yourself to keep everything
from getting distorted. For pixel graphics, having the option of
integer-only zoom factors would also be nice.

This is really a separate issue, distinct both from SDL’s scaling
abilities and from Gerry’s suggestion.

The problem you’re having is that neither your display nor your
GPU / OS will “letterbox” your picture.

Letterbox, pillarbox, or in the case of integer-scaled pixel graphics,
possibly both. In any case, it’s a very common problem, and this
isn’t just about me. I doubt I’m the only one who wants their
games to display properly and undistorted on other people’s PCs.

SDL already has support for automatic letterboxing IIRC; however, your
GPU / OS must not claim that a good unboxed display resolution is
available!

SDL selects the closest available video mode, and can add black bars to
that if necessary, true. The problem is a bit more involved though.

The problem is essentially that monitors support display modes with
non-native aspect ratios, and stretch these. For example, I’ve got a
widescreen monitor with a 16:10 aspect ratio that reports 640x480 (a
4:3 aspect ratio) as a supported mode. If SDL wants a fullscreen
640x480 mode, it sees this mode and uses it directly, which results in
stretching. From SDL’s point of view, there’s no problem. There’s no way
it can know that the display is distorted.

Because of this, simply adding black bars isn’t enough. The only way
to guarantee a correct, undistorted fullscreen display on an LCD monitor
is to always use the monitor’s native resolution (typically the
desktop resolution), and manually scale the view to fill the screen,
adding black bars where appropriate.
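
The math for the manual scaling is simple. A minimal sketch (plain C,
no SDL calls, example numbers only) of computing the destination
rectangle, with an optional integer-only mode for pixel graphics:

    /* Fit a fixed logical resolution into the display's native
     * resolution, preserving aspect ratio and optionally restricting
     * the zoom to integer factors. */
    #include <stdio.h>

    struct rect { int x, y, w, h; };

    static struct rect fit_view(int game_w, int game_h,
                                int screen_w, int screen_h, int integer_only)
    {
        double sx = (double)screen_w / game_w;
        double sy = (double)screen_h / game_h;
        double scale = sx < sy ? sx : sy;
        struct rect dst;

        if (integer_only && scale > 1.0)
            scale = (int)scale;           /* e.g. 2.18 -> 2 for crisp pixels */

        dst.w = (int)(game_w * scale);
        dst.h = (int)(game_h * scale);
        dst.x = (screen_w - dst.w) / 2;   /* bars left/right  (pillarbox) */
        dst.y = (screen_h - dst.h) / 2;   /* bars top/bottom  (letterbox) */
        return dst;
    }

    int main(void)
    {
        /* A 640x480 game on a 1680x1050 (16:10) panel. */
        struct rect r = fit_view(640, 480, 1680, 1050, 0);
        printf("scaled view: %dx%d at (%d,%d)\n", r.w, r.h, r.x, r.y);
        return 0;
    }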

In a perfect world this wouldn’t be necessary, but:

  • Many monitors don’t support aspect-ratio display at all.
  • Some monitors do support aspect-ratio display, but default to
    stretching, and don’t save the setting, so it has to be set manually
    every time the monitor switches resolution.
  • In any case, the quality of monitor-stretched images varies wildly.
  • Some graphics card drivers have an option (not enabled by default) to
    override monitor scaling and do it on the gfx card instead, but not
    all drivers have this.
  • Even those that do don’t always support aspect-ratio display.
  • Even those that do support it don’t always actually work.
  • And lastly, even if someone’s got a nice monitor that supports aspect
    ratio display, or graphics drivers that can fix the problem, they may
    not actually know about it, or not know how to configure it.

The solution (or rather, work-around) to all these problems is to use
native resolution and do the scaling yourself, like I said. This isn’t
uncommon practice in recent 2D games, and it’d be nice if SDL supported
it directly.

  • Gerry

Den Wed, 29 Apr 2009 22:26:13 -0400
skrev Donny Viszneki <@Donny_Viszneki>:

This would be great. Especially since most LCD monitors have this
unfortunate tendency to stretch fullscreen modes to the entire
screen while ignoring aspect ratio, so for fixed-resolution stuff
you pretty much have to add black bars yourself to keep everything
from getting distorted. For pixel graphics, having the option of
integer-only zoom factors would also be nice.

This is really a separate issue, distinct both from SDL’s scaling
abilities and from Gerry’s suggestion.

The problem you’re having is that neither your display nor your
GPU / OS will “letterbox” your picture.

Letterbox, pillarbox, or in the case of integer-scaled pixel graphics,
possibly both. In any case, it’s a very common problem, and this
isn’t just about me. I doubt I’m the only one who wants their
games to display properly and undistorted on other people’s PCs.

Yes, but the feature you’re asking for will not display your games
"properly and undistorted."

SDL already has support for automatic letterboxing IIRC; however, your
GPU / OS must not claim that a good unboxed display resolution is
available!

SDL selects the closest available video mode, and can add black bars to
that if necessary, true. The problem is a bit more involved though.

The problem is essentially that monitors support display modes with
non-native aspect ratios, and stretch these. For example, I’ve got a
widescreen monitor with a 16:10 aspect ratio that reports 640x480 (a
4:3 aspect ratio) as a supported mode.

The proper solution is to get your operating system to deny these
video modes for that display.

If SDL wants a fullscreen
640x480 mode, it sees this mode and uses it directly, which results in
stretching. From SDL’s point of view, there’s no problem. There’s no way
it can know that the display is distorted.

Because of this, simply adding black bars isn’t enough.

Actually, that would work, and is in fact a better suggestion. Why not
an environment variable like SDL_FORCERESOLUTION which would force SDL
to ask for a certain resolution, and then letterbox the client
application’s content as discussed above?

The only way
to guarantee a correct, undistorted fullscreen display on an LCD monitor
is to always use the monitor’s native resolution (typically the
desktop resolution), and manually scale the view to fill the screen,
adding black bars where appropriate.

I think you need to qualify your “only way” with the perspective from
which you are speaking. Do you mean the only way for an SDL game? The
only way for your graphics driver?

In a perfect world this wouldn’t be necessary, but:

  • Many monitors don’t support aspect-ratio display at all.
  • Some monitors do support aspect-ratio display, but default to
    stretching, and don’t save the setting, so it has to be set manually
    every time the monitor switches resolution.
  • In any case, the quality of monitor-stretched images varies wildly.
  • Some graphics card drivers have an option (not enabled by default) to
    override monitor scaling and do it on the gfx card instead, but not
    all drivers have this.
  • Even those that do don’t always support aspect-ratio display.
  • Even those that do support it don’t always actually work.
  • And lastly, even if someone’s got a nice monitor that supports aspect
    ratio display, or graphics drivers that can fix the problem, they may
    not actually know about it, or not know how to configure it.

What is “aspect-ratio display?” I understand “aspect ratio” and
"display," but not both combined.

The solution (or rather, work-around) to all these problems is to use
native resolution and do the scaling yourself, like I said. ?This isn’t
uncommon practice in recent 2D games, and it’d be nice if SDL supported
it directly.

I feel inclined to clarify that while “letterboxing” is a means of
adapting a picture to another display resolution, it is not the same
as what I was calling “scaling,” which is more clearly referred to as
"resampling."

I was also assuming that nobody would want to resample their
picture rather than integer-scale and/or letterbox it. Perhaps I am
wrong. How do you feel about resampling games’ framebuffers before
displaying them?

On Wed, Apr 29, 2009 at 11:44 PM, Gerry JJ wrote:


http://codebad.com/

[…aspect ratio, scaling etc…]

The solution (or rather, work-around) to all these problems is to
use native resolution and do the scaling yourself, like I said.
This isn’t uncommon practice in recent 2D games, and it’d be nice if
SDL supported it directly.

Be warned though; large displays can be very expensive to drive in
their native resolutions! (A 30" is normally 2560x1600, for example.
That is, 4 Mpixel, or 16 MB/frame in 32 bpp mode.)

With a serious video card (or two…) and hardware scaling, it works
fine, but budget video cards or software scaling just don’t cut it.

I would suggest scaling to the lowest supported resolution that has
the same aspect ratio as the native resolution, or possibly the
lowest supported resolution that is an integer factor lower than the
native resolution. Unless you’re using interpolated/filtered scaling,
this doesn’t even affect the visual appearance, but obviously, it
saves loads of bandwidth.

On Thursday 30 April 2009, Gerry JJ wrote:


//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
’-- http://www.reologica.se - Rheology instrumentation --’

The problem is essentially that monitors support display modes with
non-native aspect ratios, and stretch these. For example, I’ve got a
widescreen monitor with a 16:10 aspect ratio that reports 640x480 (a
4:3 aspect ratio) as a supported mode. If SDL wants a fullscreen
640x480 mode, it sees this mode and uses it directly, which results in
stretching. From SDL’s point of view, there’s no problem. There’s no way
it can know that the display is distorted.

CRT monitors behaved just the same, it’s just that back then, people
assuming a 4:3 aspect ratio had better odds of being right. But that
didn’t make them any more correct, and for those very rare widescreen
CRTs, they’d put on 640x480, and everything would be “fat” (stretched
out horizontally). Video drivers (or maybe Windows? not sure) are the
culprit here: they get the monitor information via EDID (or in the
olden days, on Windows, through having installed/selected the monitor
"driver"), but claim non-native aspect ratio resolutions.

On Windows, you’ll notice a 1280x1024 resolution, very often. That’s
not 4:3, it’s 5:4, and if you select it on a 4:3 CRT monitor, it gets
stretched out horizontally, no pillarboxing. You could say that when
you have an LCD that has optional pillar/letterboxing, it’s a step up
from what we had with CRTs.

And the other person who pointed out that driving a 30" monitor at full
resolution in software might be crazy is quite right, and that’s
where the more compatible but software-only GDI driver I mentioned
before can lose out pretty massively (that’s 480 MB/s at 32 bpp,
30 fps! That’s 2 lanes of PCIe 1.x, just for 2D!).

On Wed, Apr 29, 2009 at 11:44 PM, Gerry JJ wrote:


http://pphaneuf.livejournal.com/

Den Thu, 30 Apr 2009 04:25:19 -0400
skrev Donny Viszneki <donny.viszneki at gmail.com>:

Yes, but the feature you’re asking for will not display your games
"properly and undistorted."

Well, someone’s misunderstanding here, obviously. What feature do you
think I’m asking for?

The proper solution is to get your operating system to deny these
video modes for that display.

You’re missing the point. As I explained, this isn’t always possible,
and even if it is, you can fix it for yourself but not everyone else
who happens to play your game. Telling people to fix their OS isn’t a
solution.

Because of this, simply adding black bars isn’t enough.

Actually, that would work, and is in fact a better suggestion.

No, it wouldn’t. Read my mail again.

Why not
an environment variable like SDL_FORCERESOLUTION which would force SDL
to ask for a certain resolution, and then letterbox the client
application’s content as discussed above?

You’d prefer having to mess with environment variables over a proper
API? This also doesn’t allow for scaling, unless you add environment
variables for that… ugh.

I think you need to qualify your “only way” with the perspective from
which you are speaking. Do you mean the only way for an SDL game? The
only way for your graphics driver?

As I said in my mail, both monitors and drivers are broken in a
multitude of ways, which leaves few options that work. Again, the point
here isn’t just to get something that works on your own PC, but something
that works on as many PCs as possible.

What is “aspect-ratio display?” I understand “aspect ratio” and
"display," but not both combined.

Well, I’m not a native English speaker, so please excuse me if some
things are unclear, and don’t be afraid of asking for clarifications.
What I mean is showing the contents of the window, which when fullscreen
covers the entire display (i.e. there are no window decorations), with
correct aspect ratio.

I feel inclined to clarify that while “letterboxing” is a means of
adapting a picture to another display resolution, it is not the same
as what I was calling “scaling,” which is more clearly referred to as
"resampling."

I was also assuming that nobody would want to resample their
picture rather than integer-scale and/or letterbox it. Perhaps I am
wrong. How do you feel about resampling games’ framebuffers before
displaying them?

It depends on the game, the type of graphics, whether you’re scaling up
or down, the method of resampling… Different games have different
needs. Integer-factor nearest neighbour is best for scaling up pixel art,
for example, but that doesn’t mean you have to use it when scaling
photos. In any case, different people have different preferences.

  • Gerry

Den Thu, 30 Apr 2009 16:55:04 +0200
skrev David Olofson :

[…aspect ratio, scaling etc…]

The solution (or rather, work-around) to all these problems is to
use native resolution and do the scaling yourself, like I said.
This isn’t uncommon practice in recent 2D games, and it’d be nice if
SDL supported it directly.

Be warned though; large displays can be very expensive to drive in
their native resolutions! (A 30" is normally 2560x1600, for example.
That is, 4 Mpixel, or 16 MB/frame in 32 bpp mode.)

With a serious video card (or two…) and hardware scaling, it works
fine, but budget video cards or software scaling just don’t cut it.

Yeah, that is a problem… You’d think people with such huge monitors
could afford graphics cards to drive them though =). Anyway, perhaps
hardware scaling could be detected, and native resolution with scaling
only used if hardware scaling is available? That might mean distorted
display on PCs that are too old, but it’s still better than the game
running too slow to play when it’d otherwise work fine.

I would suggest scaling to the lowest supported resolution that has
the same aspect ratio as the native resolution, or possibly the
lowest supported resolution that is an integer factor lower than the
native resolution. Unless you’re using interpolated/filtered scaling,
this doesn’t even affect the visual appearance, but obviously, it
saves loads of bandwidth.

Good idea =). Although, there are some LCD monitors and gfx drivers
that silently blur the image when scaling up even integer factors, so
image quality might not be the best with this. Also, native resolution
usually means no delay for mode-switching (some monitors can take
several seconds to switch). Still, it’s an option…
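
To make that concrete, here is a rough sketch (plain C, not an SDL API)
of the selection rule David describes: take the lowest-resolution mode
that either matches the native aspect ratio exactly or divides the
native mode by an integer factor:

    #include <stdio.h>

    struct mode { int w, h; };

    /* Return the index of the lowest-area mode that keeps the native
     * aspect ratio or is an exact integer factor smaller, or -1. */
    static int pick_mode(const struct mode *modes, int n, struct mode native)
    {
        int best = -1;
        long best_area = 0;
        int i;
        for (i = 0; i < n; ++i) {
            int same_aspect = (long)modes[i].w * native.h ==
                              (long)modes[i].h * native.w;
            int int_factor  = native.w % modes[i].w == 0 &&
                              native.h % modes[i].h == 0 &&
                              native.w / modes[i].w == native.h / modes[i].h;
            long area = (long)modes[i].w * modes[i].h;
            if ((same_aspect || int_factor) && (best < 0 || area < best_area)) {
                best = i;
                best_area = area;
            }
        }
        return best;   /* -1 means: just fall back to the native mode */
    }

    int main(void)
    {
        struct mode native = { 2560, 1600 };
        struct mode modes[] = { {640,480}, {800,600}, {1024,768},
                                {1280,800}, {1280,1024}, {1920,1200},
                                {2560,1600} };
        int i = pick_mode(modes, sizeof modes / sizeof modes[0], native);
        if (i >= 0)
            printf("Would render at %dx%d and let the hardware scale up.\n",
                   modes[i].w, modes[i].h);
        return 0;
    }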

  • Gerry

On Thursday 30 April 2009, Gerry JJ wrote:

Den Thu, 30 Apr 2009 13:01:38 -0400
skrev Pierre Phaneuf :

The problem is essentially that monitors support display modes with
non-native aspect ratios, and stretch these. For example, I’ve got
a widescreen monitor with a 16:10 aspect ratio that reports 640x480
(a 4:3 aspect ratio) as a supported mode. If SDL wants a fullscreen
640x480 mode, it sees this mode and uses it directly, which results
in stretching. From SDL’s point of view, there’s no problem.
There’s no way it can know that the display is distorted.

CRT monitors behaved just the same, it’s just that back then, people
assuming a 4:3 aspect ratio had better odds of being right. But that
didn’t make them any more correct, and for those very rare widescreen
CRTs, they’d put on 640x480, and everything would be “fat” (stretched
out horizontally). Video drivers (or maybe Windows? not sure) are the
culprit here: they get the monitor information via EDID (or in the
olden days, on Windows, through having installed/selected the monitor
"driver"), but claim non-native aspect ratio resolutions.

On Windows, you’ll notice a 1280x1024 resolution, very often. That’s
not 4:3, it’s 5:4, and if you select it on a 4:3 CRT monitor, it gets
stretched out horizontally, no pillarboxing. You could say that when
you have an LCD that has optional pillar/letterboxing, it’s a step up
from what we had with CRTs.

True, but I haven’t seen a CRT monitor that didn’t have controls for
stretching the image to correct the aspect ratio, and many also
had position controls so you could add black bars yourself. Also, most
CRTs I’ve seen would remember the settings for each mode, so you
wouldn’t have to set them again the next time, which is
already better than most LCDs. Many LCD monitors have no letterboxing
option at all, and just always stretch.

And the other person who pointed out that driving a 30" monitor at full
resolution in software might be crazy is quite right, and that’s
where the more compatible but software-only GDI driver I mentioned
before can lose out pretty massively (that’s 480 MB/s at 32 bpp,
30 fps! That’s 2 lanes of PCIe 1.x, just for 2D!).

Yeah, scaling should probably only be used if it can be done in
hardware (see my reply to David).

  • Gerry

On Wed, Apr 29, 2009 at 11:44 PM, Gerry JJ <@Gerry_Jo_Jellestad> wrote:

Den Thu, 30 Apr 2009 04:25:19 -0400
skrev Donny Viszneki <@Donny_Viszneki>:

Yes, but the feature you’re asking for will not display your games
"properly and undistorted."

Well, someone’s misunderstanding here, obviously. What feature do you
think I’m asking for?

You’re asking for non-integer-scale picture resampling. There is
nothing “proper and undistorted” about that.

The proper solution is to get your operating system to deny these
video modes for that display.

You’re missing the point.

I didn’t miss any point. I’m just being thorough. :\

Sorry for the confusion.

Because of this, simply adding black bars isn’t enough.

Actually, that would work, and is in fact a better suggestion.

No, it wouldn’t. Read my mail again.

Why not
an environment variable like SDL_FORCERESOLUTION which would force SDL
to ask for a certain resolution, and then letterbox the client
application’s content as discussed above?

You’d prefer having to mess with environment variables over a proper
API? This also doesn’t allow for scaling, unless you add environment
variables for that… ugh.

I’m a bit confounded by this entire reaction. What API are you
proposing? SDL_AskUserIfPictureLooksStretched()? Let’s consider a few
things:

Environment variables provide both an informal API (via the setenv()
API) and a way for users to set SDL options without the
application’s intervention. That means those games you say we want to
be displayed a certain way on our feature-deficient displays will be
able to display properly even if they were written before this new API
you propose is created. But let’s also take a look at the way SDL
already deals with changing SDL options which oughtn’t require
application intervention:

I cannot think of even a single SDL application which has a dialog
asking me what video back-end (SDL_VIDEODRIVER environment variable)
or what audio back-end (SDL_AUDIODRIVER) I want to use. You might be
able to make the argument that without a functioning video back-end,
perhaps the application cannot ask the user anything and so there
needs to be an outside-the-application solution for setting this
option. But SDL_VIDEODRIVER is only one such option of several.

Why then would letterboxing and integer-scaling/resampling the picture
be different?

The most confounding thing is that if you really want a formal API for
doing this, it already exists! Just draw your picture to an
intermediate buffer! It’s pretty painless, what more could you
possibly want? SDL_PleaseResampleVideoSurfaceForMe()?
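
For example, a minimal sketch of that approach with SDL 1.2 (it assumes
the screen and the intermediate buffer are both 32 bpp and share a pixel
format; real code would convert formats and handle errors properly):

    /* Render the game into a small intermediate buffer, then integer-
     * scale it to the screen with a nearest-neighbour copy. */
    #include "SDL.h"

    static void blit_integer_scaled(SDL_Surface *src, SDL_Surface *dst,
                                    int factor, int dst_x, int dst_y)
    {
        int x, y;
        for (y = 0; y < src->h * factor; ++y) {
            Uint32 *s = (Uint32 *)((Uint8 *)src->pixels +
                                   (y / factor) * src->pitch);
            Uint32 *d = (Uint32 *)((Uint8 *)dst->pixels +
                                   (dst_y + y) * dst->pitch) + dst_x;
            for (x = 0; x < src->w * factor; ++x)
                d[x] = s[x / factor];
        }
    }

    int main(int argc, char *argv[])
    {
        SDL_Surface *screen, *buf;

        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;
        screen = SDL_SetVideoMode(1280, 960, 32, SDL_SWSURFACE);
        buf = SDL_CreateRGBSurface(SDL_SWSURFACE, 320, 240, 32, 0, 0, 0, 0);
        if (!screen || !buf)
            return 1;

        /* The game would draw its whole frame into buf here. */
        SDL_FillRect(buf, NULL, SDL_MapRGB(buf->format, 200, 80, 80));

        if (SDL_MUSTLOCK(screen))
            SDL_LockSurface(screen);
        blit_integer_scaled(buf, screen, 4, 0, 0);  /* 320x240 * 4 = 1280x960 */
        if (SDL_MUSTLOCK(screen))
            SDL_UnlockSurface(screen);

        SDL_Flip(screen);
        SDL_Delay(2000);
        SDL_Quit();
        return 0;
    }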

On the other hand, if we want this option to provide a picture scaling
solution that is backward compatible with existing SDL/JEDI/Pygame
applications, the proposed environment variable route actually works
as opposed to adding a formal C API specifically for this feature.

Proposed environment variable route:

SDL_VIDEORESOLUTION = "WxH"
SDL_VIDEOSCALE = "WxH"
SDL_VIDEORESAMPLE = integer

The first option forces SDL to ask for a certain video mode and
letterbox as necessary. The second option forces SDL to provide the
programmer with an intermediate buffer when they would ordinarily get
the video surface, and then integer-scale and/or resample the picture
as necessary to fit the user’s specifications. The third option would
specify a resampling algorithm as per SDL_TextureScaleMode:

    /**
     *  \enum SDL_TextureScaleMode
     *
     *  \brief The texture scale mode used in SDL_RenderCopy()
     */
    typedef enum
    {
        SDL_TEXTURESCALEMODE_NONE = 0x00000000,  /**< No scaling, rectangles must match dimensions */
        SDL_TEXTURESCALEMODE_FAST = 0x00000001,  /**< Point sampling or equivalent algorithm */
        SDL_TEXTURESCALEMODE_SLOW = 0x00000002,  /**< Linear filtering or equivalent algorithm */
        SDL_TEXTURESCALEMODE_BEST = 0x00000004   /**< Bicubic filtering or equivalent algorithm */
    } SDL_TextureScaleMode;
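
To make the proposal concrete, a hypothetical usage sketch; none of
these environment variables exist in SDL, they are only the names
proposed above, and the values are made up:

    /* Hypothetical: SDL_VIDEORESOLUTION, SDL_VIDEOSCALE and
     * SDL_VIDEORESAMPLE are only the names proposed in this thread. */
    #include <stdlib.h>
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        /* POSIX setenv(); a launcher script could do the same with
         * "export VAR=value" before starting the program. */
        setenv("SDL_VIDEORESOLUTION", "1680x1050", 1); /* mode to really request   */
        setenv("SDL_VIDEOSCALE", "640x480", 1);        /* buffer size the app sees */
        setenv("SDL_VIDEORESAMPLE", "1", 1);           /* resampling algorithm     */

        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;
        /* ... the rest of the application would be unchanged ... */
        SDL_Quit();
        return 0;
    }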

Comments?


http://codebad.com/

[…]

Den Thu, 30 Apr 2009 16:55:04 +0200
skrev David Olofson <@David_Olofson>:
[…]

Be warned though; large displays can be very expensive to drive
in their native resolutions! (A 30" is normally 2560x1600, for
example. That is, 4 Mpixel, or 16 MB/frame in 32 bpp mode.)

With a serious video card (or two…) and hardware scaling, it
works fine, but budget video cards or software scaling just
don’t cut it.

Yeah, that is a problem… You’d think people with such huge
monitors could afford graphics cards to drive them though =).

Well, yeah - and with the current generation of video cards, you
shouldn’t even need more than one. :)

Either way, I’m driving mine with two overclocked 8800GT cards, backed
by a 3 GHz Core 2 Quad, and anything that uses OpenGL or DirectX runs
just fine in 2560x1600. That includes Kobo Deluxe in glSDL mode - but
SDL software rendering (X11) results in a slideshow… hehe

Anyway, perhaps hardware scaling could be detected, and native
resolution with scaling only used if hardware scaling is available?

Well, one would think that if the system reports, say, 1280x800 as
available, that resolution would somehow be scaled appropriately,
either by the video card or by the monitor. Depends on the OS, driver
and configuration, though… At least, the nVidia drivers (Linux and
Windows) allow the user to configure how non-native resolutions are
handled.

That might mean distorted display on PCs that are too old, but it’s
still better than the game running too slow to play when it’d otherwise
work fine.

Yeah… Either way, there is probably no 100% safe automatic solution,
so I guess there’ll always be some users that need that “Advanced
Video Options” tab.

I would suggest scaling to the lowest supported resolution that
has the same aspect ratio as the native resolution, or possibly
the lowest supported resolution that is an integer factor lower
than the native resolution. Unless you’re using
interpolated/filtered scaling, this doesn’t even affect the visual
appearance, but obviously, it saves loads of bandwidth.

Good idea =). Although, there are some LCD monitors and gfx drivers
that silently blur the image when scaling up even integer factors,
so image quality might not be the best with this.

Yeah, I know… The 30" Apple Cinema doesn’t blur integer scale
factors - but then it hardly supports scaling at all! :D IIRC, it’ll
take 1024x768 and scale that with “filtering”, but other than that,
it’s 2560x1600 and integer factors only.

Also, native resolution usually means no delay for mode-switching
(some monitors can take several seconds to switch). Still, it’s an
option…

That’s nice, of course. Not that I see why LCDs should need
this “setup time” (detecting the input signal and recalculating a few
filter parameters takes that long…?), but apparently, they do…

On Friday 01 May 2009, Gerry JJ wrote:


//David Olofson - Programmer, Composer, Open Source Advocate

.------- http://olofson.net - Games, SDL examples -------.
| http://zeespace.net - 2.5D rendering engine |
| http://audiality.org - Music/audio engine |
| http://eel.olofson.net - Real time scripting |
’-- http://www.reologica.se - Rheology instrumentation --’

Den Thu, 30 Apr 2009 19:59:47 -0400
skrev Donny Viszneki <donny.viszneki at gmail.com>:

Den Thu, 30 Apr 2009 04:25:19 -0400
skrev Donny Viszneki <donny.viszneki at gmail.com>:

Yes, but the feature you’re asking for will not display your
games “properly and undistorted.”

Well, someone’s misunderstanding here, obviously. What feature do
you think I’m asking for?

You’re asking for non-integer-scale picture resampling. There is
nothing “proper and undistorted” about that.

No, I’m asking for integer OR non-integer-scale picture resampling,
depending on the game’s needs, with black borders added to maintain
correct aspect ratio. The distortion I’m talking about is incorrect
aspect ratios. A circle is a circle, not an oval; a square is a
square, not a rectangle.

I take it you don’t like non-integer-scale resampling, and that the
resulting blur is what you mean by distortion? Well, I actually agree
with that to some extent, but that doesn’t mean that it should be
impossible to do, and I do think it makes sense for some things.

To clarify: I do not want incorrect aspect ratios. I also do not want
tiny stamp-sized displays on a huge black void in fullscreen mode.
Monitors and gfx drivers conspire to make this combination difficult.
Hence, native resolution, scaling and black bars.

You’re missing the point.

I didn’t miss any point. I’m just being thorough. :\

Sorry for the confusion.

Hey no problem, we can work this out =). I do still think you’re
misunderstanding me though.

You’d prefer having to mess with environment variables over a proper
API? This also doesn’t allow for scaling, unless you add
environment variables for that… ugh.

I’m a bit confounded by this entire reaction. What API are you
proposing? SDL_AskUserIfPictureLooksStretched()? Let’s consider a few
things:

No, of course not. My entire point is that the user (as in, the person
playing the game) shouldn’t have to do anything to get an image with
the correct aspect ratio, possibly scaled (integer or not, depending)
to fill the screen as much as possible while maintaining that aspect
ratio.

From a user’s point of view:

  • They launch the game.
  • Game goes fullscreen (might be default, user might switch from
    windowed mode, doesn’t matter).
  • Game magically displays with correct aspect ratio. And they didn’t
    have to configure a thing! Yay!

I like it when things Just Work. I’d like my games to Just Work for
other people, as well. They should not have to mess with environment
variables to get things to look correct; to me that’s a horrible
solution.

Also, we’re talking about SDL 1.3 here. There’s no need to add
additional environment variable hacks.

Anyway, to do this reliably, from the developer’s point of view, you’ll
have to use native resolution and scale the image to fill the screen
while maintaining correct aspect ratio, as I went over before. SDL
could potentially even do this automatically, but an API to select what
to do would be nice (scaling type if any, integer or non-integer, etc).
(As a bonus, this could also be used to add black bars and scaling in
resizable windowed modes.)

  • Gerry

On Thu, Apr 30, 2009 at 7:27 PM, Gerry JJ <@Gerry_Jo_Jellestad> wrote:

Den Fri, 1 May 2009 02:04:12 +0200
skrev David Olofson :

Anyway, perhaps hardware scaling could be detected, and native
resolution with scaling only used if hardware scaling is available?

Well, one would think that if the system reports, say, 1280x800 as
available, that resolution would somehow be scaled appropriately,
either by the video card or by the monitor. Depends on the OS, driver
and configuration, though… At least, the nVidia drivers (Linux and
Windows) allow the user to configure how non-native resolutions are
handled.

Yeah, you’d think so, but unfortunately the opposite is far more
likely, currently at least, unless 1280x800 happens to be the native
resolution. Also, since you mention the nVidia drivers, in my
experience their scaling override doesn’t even always work.

That might mean distorted display on PCs that are too old, but it’s
still better than the game running too slow to play when it’d otherwise
work fine.

Yeah… Either way, there is probably no 100% safe automatic
solution, so I guess there’ll always be some users that need that
"Advanced Video Options" tab.

Yep.

Yeah, I know… The 30" Apple Cinema doesn’t blur integer scale
factors - but then it hardly supports scaling at all! :D IIRC, it’ll
take 1024x768 and scale that with “filtering”, but other than that,
it’s 2560x1600 and integer factors only.

So anything other than 1024x768 and 2560x1600 gives you a world of black
bars everywhere, or are other modes just not supported at all?

Also, native resolution usually means no delay for mode-switching
(some monitors can take several seconds to switch). Still, it’s an
option…

That’s nice, of course. Not that I see why LCDs should need
this “setup time” (detecting the input signal and recalculating a few
filter parameters takes that long…?), but apparently, they do…

Yeah, I was very surprised when I first found out about that too.
Doesn’t make sense to me that an LCD should take three times longer
than even a slow CRT to switch resolutions, but there you go.

  • Gerry

True, but I haven’t seen a CRT monitor that didn’t have controls for
stretching the image to correct the aspect ratio, and many also
had position controls so you could add black bars yourself. Also, most
CRTs I’ve seen would remember the settings for each mode, so you
wouldn’t have to set them again the next time, which is
already better than most LCDs. Many LCD monitors have no letterboxing
option at all, and just always stretch.

You could tweak the aspect ratio a little bit by increasing the
margins, but not by a huge amount, true. None of this would be
automatic, either (although it would be remembered, usually for a
certain maximum number of resolutions, true).

There are some wild combinations of LCD monitor capabilities, all the
way to the 30" HP I have at work, which can’t do anything at all, no
stretching, no contrast adjustment, no on-screen display, zilch
(there are four buttons on it: power, an input selector switch, and
brightness +/-).

On the other hand, my 40" LCD TV (which can take a DVI input) knows
how to put all the appropriate black bars by itself, and it’s cheaper,
bigger and higher resolution than a number of computer monitors I’ve
bought!

On Thu, Apr 30, 2009 at 7:45 PM, Gerry JJ wrote:


http://pphaneuf.livejournal.com/

Den Thu, 30 Apr 2009 19:59:47 -0400

  • They launch the game.
  • Game goes fullscreen (might be default, user might switch from
    windowed mode, doesn’t matter).
  • Game magically displays with correct aspect ratio. And they didn’t
    have to configure a thing! Yay!

I like it when things Just Work. I’d like my games to Just Work for
other people, as well. They should not have to mess with environment
variables to get things to look correct; to me that’s a horrible
solution.

Anyway, to do this reliably, from the developer’s point of view, you’ll
have to use native resolution and scale the image to fill the screen
while maintaining correct aspect ratio, as I went over before. SDL
could potentially even do this automatically, but an API to select what
to do would be nice (scaling type if any, integer or non-integer, etc).
(As a bonus, this could also be used to add black bars and scaling in
resizable windowed modes.)

Ah, well this is where we diverge.

Also, we’re talking about SDL 1.3 here. There’s no need to add
additional environment variable hacks.

What does 1.3 have to do with environment variables?

On Thu, Apr 30, 2009 at 8:46 PM, Gerry JJ wrote:


http://codebad.com/

By which I mean: how can this be done? It doesn’t seem possible to me.
If the monitor + driver + OS / X are not giving the information to SDL,
what recourse does SDL have?

On Fri, May 1, 2009 at 2:42 AM, Donny Viszneki <@Donny_Viszneki> wrote:

On Thu, Apr 30, 2009 at 8:46 PM, Gerry JJ wrote:

Den Thu, 30 Apr 2009 19:59:47 -0400

  • They launch the game.
  • Game goes fullscreen (might be default, user might switch from
    windowed mode, doesn’t matter).
  • Game magically displays with correct aspect ratio. And they didn’t
    have to configure a thing! Yay!

I like it when things Just Work. I’d like my games to Just Work for
other people, as well. They should not have to mess with environment
variables to get things to look correct; to me that’s a horrible
solution.

Anyway, to do this reliably, from the developer’s point of view, you’ll
have to use native resolution and scale the image to fill the screen
while maintaining correct aspect ratio, as I went over before. SDL
could potentially even do this automatically, but an API to select what
to do would be nice (scaling type if any, integer or non-integer, etc).
(As a bonus, this could also be used to add black bars and scaling in
resizable windowed modes.)

Ah, well this is where we diverge.


http://codebad.com/

Den Fri, 1 May 2009 00:38:40 -0400
skrev Pierre Phaneuf :

On the other hand, my 40" LCD TV (which can take a DVI input) knows
how to put all the appropriate black bars by itself, and it’s cheaper,
bigger and higher resolution than a number of computer monitors I’ve
bought!

Yeah… People generally seem to care more about these things for TVs
than monitors for some reason, so TVs tend to be better in that regard.
Maybe people should be a little bit more selective when choosing a
monitor, eh? ;)

  • Gerry

Den Fri, 1 May 2009 02:44:53 -0400
skrev Donny Viszneki <donny.viszneki at gmail.com>:

Ah, well this is where we diverge.

By which I mean: how can this be done? It doesn’t seem possible to me.
If the monitor + driver + OS / X are not giving the information to SDL,
what recourse does SDL have?

That, my friend, is where native resolution comes in. A monitor’s
native resolution is by definition the perfect display mode for that
monitor, so it’s pretty much guaranteed to not be distorted in any way,
assuming the monitor uses square pixels (a few don’t, but I don’t think
there’s any way to find that out, so let’s just ignore it).

SDL could perhaps find the native resolution by querying EDID, or
simpler, just assume that the desktop resolution is the native one,
which is usually the case. The latter would also automatically work
with CRT monitors.
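
A small sketch of the “assume the desktop resolution is native”
approach: SDL 1.2.10 and later report the desktop size through
SDL_GetVideoInfo(), as long as it is queried before SDL_SetVideoMode():

    #include <stdio.h>
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        const SDL_VideoInfo *info;
        int native_w, native_h;

        if (SDL_Init(SDL_INIT_VIDEO) < 0)
            return 1;

        /* current_w/current_h hold the desktop mode until a mode is set. */
        info = SDL_GetVideoInfo();
        native_w = info->current_w;
        native_h = info->current_h;
        printf("Assuming native resolution: %dx%d\n", native_w, native_h);

        /* A game would now request exactly this mode fullscreen and
           letterbox/scale its fixed logical resolution into it. */

        SDL_Quit();
        return 0;
    }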

What does 1.3 have to do with environment variables?

1.2 API is frozen, 1.3 API isn’t.

  • Gerry

Donny Viszneki wrote:

By which I mean: how can this be done? It doesn’t seem possible to me.
If the monitor + driver + OS / X are not giving the information to SDL,
what recourse does SDL have?

At the minimum, SDL could allow you to specify logical resolution and
physical resolution separately.
Rainer Deyke - rainerd at eldwood.com