SDL 3.0

However, this can lead to bloatware in cases where gamma-correct color blending and image scaling are not natively supported by the hardware. After all, ‘anti-aliasing’ of this kind involves rendering at 2×2, 4×4, or even higher multiples of the native resolution and then applying a gamma-correct box filter down to the target image, and it is that latter step that is the bottleneck when performed in software. And the color blending must be gamma-correct, or the result will be no better than rendering directly at the target resolution. Having built-in support for ‘anti-aliasing’ is going to implicitly encourage many developers to ship bloated rendering paths and unnecessarily increase system requirements. There are already many HTML5 apps that suffer from this kind of bloated rendering, and they don’t even get the anti-aliasing right.

But what if code compiled with uint32_t defined as an unsigned long type is linked with an SDL3 library compiled with uint32_t defined as an unsigned int type?
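(On common C ABIs that question mostly reduces to whether both definitions are 32-bit unsigned types, in which case the linked code should be compatible. A cheap compile-time guard, using SDL's existing SDL_COMPILE_TIME_ASSERT macro from SDL_stdinc.h, can catch a platform where that assumption breaks; this is my own sketch, not anything from the thread:)

```c
#include <stdint.h>
#include <SDL3/SDL_stdinc.h>

/* Guard against a platform where the two definitions could diverge in size;
 * if either assert fails, mixing the two builds is not safe to link. */
SDL_COMPILE_TIME_ASSERT(c_uint32_is_32bit, sizeof(uint32_t) == 4);
SDL_COMPILE_TIME_ASSERT(sdl_uint32_is_32bit, sizeof(Uint32) == 4);
```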

There should be a way to get the system virtual pixel density (which tends to be separate from the physical density, at least on Windows and Android) and the system’s “100%” virtual density (which tends to be 96 dpi on Windows and 160 dpi on Android), so that an SDL3 interface can be scaled the same way the system scales. When the virtual pixel changes size relative to the virtual inch (such as switching from 100% to 125% in the Windows 10 settings, or moving between screens with different virtual pixel densities), there should be an event for it. However, this shouldn’t distract SDL3 from Windows XP support, since Vista is more expensive to run, and Windows 7 even more so, not to mention that when there is no more useful software for a system it often ends up as e-waste. If an SDL3 app on Windows doesn’t run on Windows XP or Server 2003, then what’s even the point?
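For what it’s worth, SDL3 has been growing a “content scale” query and a scale-change event along these lines. A minimal sketch, with the caveat that the exact names reflect my reading of the in-progress API rather than anything confirmed in this thread:

```c
#include <SDL3/SDL.h>

/* A scale of 1.0f corresponds to the system's "100%" setting
 * (nominally 96 dpi on Windows, 160 dpi on Android). */
static void report_scale(SDL_Window *window)
{
    const float scale = SDL_GetWindowDisplayScale(window);
    SDL_Log("UI should be drawn at %.0f%% scale", scale * 100.0f);
}

static void pump_events(SDL_Window *window)
{
    SDL_Event ev;
    while (SDL_PollEvent(&ev)) {
        /* Fired when the 100%/125%/... setting changes or the window
         * moves to a display with a different scale. */
        if (ev.type == SDL_EVENT_WINDOW_DISPLAY_SCALE_CHANGED) {
            report_scale(window);
        }
    }
}
```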

Not necessarily; indeed, that’s a crude way of doing it. It’s perfectly possible to render anti-aliased graphics at the native resolution. The image below was generated that way (with SDL2); no over-sampling of any kind was involved.

[image: aagfxdem]

First of all, this isn’t even gamma-correct, which makes the anti-aliasing effectively useless here since the blending is not linear. Second, the rendering algorithm doesn’t matter, because anti-aliasing will always involve color blending, and that is the main bottleneck when done in software. And doing anything other than oversampling is very likely to lead to glitches when many overlaps are involved. Both issues are evident in many FreeType-based renderers, for instance, as the library is highly prone to being misused (the ‘smooth’ module is completely unusable for TrueType fonts, and FreeType itself doesn’t do any alpha blending, so that part is prone to being done incorrectly).

If you’re that concerned about gamma, do all your anti-aliasing and blending in a linear (e.g. 16-bit) RGB buffer and then gamma-correct, just once, when you finally transfer it to the screen. The GPU will probably do that for you.
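For concreteness, here is a minimal per-channel sketch of that approach; the helper names are mine, and the conversions use the standard sRGB transfer function:

```c
#include <math.h>

/* Convert one sRGB-encoded channel (0..1) to linear light. */
static float srgb_to_linear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : powf((s + 0.055f) / 1.055f, 2.4f);
}

/* Convert one linear-light channel (0..1) back to sRGB encoding. */
static float linear_to_srgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * powf(l, 1.0f / 2.4f) - 0.055f;
}

/* Blend source over destination with coverage 'a', doing the arithmetic
 * in linear light and encoding to sRGB only at the end. */
static float blend_channel(float dst_srgb, float src_srgb, float a)
{
    const float dst = srgb_to_linear(dst_srgb);
    const float src = srgb_to_linear(src_srgb);
    return linear_to_srgb(dst + (src - dst) * a);
}
```

In practice the conversions would be done with lookup tables (or by letting an sRGB-capable framebuffer do the final encode), but the shape of the computation is the same.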

But although gamma correction may be desirable to get the very best results, anti-aliasing done without it can be perfectly acceptable in many circumstances, and is hugely better than not doing anti-aliasing at all!

Practical (and fast) antialiasing may require approximations to be made, but that doesn’t matter if the end result is subjectively acceptable. Given that it will always be the ‘wetware’ eye-brain system that ultimately receives the image, striving for perfection is pointless!


That’s why the user has to use OpenGL, DirectX, or Metal directly, and why the earlier suggestion (“Also, I would like to see built-in support for anti-aliasing without having to use OpenGL, DirectX, or Metal directly.”) isn’t going to work.

Not really; incorrect color blending leads to all sorts of artifacts that bilevel rendering doesn’t have.

I don’t think misuse of sRGB values even deserves to be called an approximation, given how inaccurate it is. A reasonable approximation would be one where the conversions between linear and sRGB use approximate formulas (such as a plain gamma of 2.2, or lookup tables) but still round-trip, and where the halfway blend between 0x00 and 0xFF comes out close to 0xBC. And in cases where anti-aliasing is impractical, bilevel rendering is going to be the definitive choice.
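To spell out that 0xBC figure (my own arithmetic, not from the thread): the linear midpoint of 0.0 and 1.0 is 0.5, and the standard sRGB encoding gives 1.055 × 0.5^(1/2.4) − 0.055 ≈ 0.735, i.e. about 188 = 0xBC out of 255. Blending the sRGB bytes directly would give 0x80 instead, which is why the halfway blend makes a quick litmus test for gamma-correct blending.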

How is SDL3 going to handle the situations that cause bad UTF-8 in Android clipboard and Surface glitching in SDL2?

There’s no reason, that I can see, why SDL3 shouldn’t include gamma-correction as a built-in feature, at the same time as adding support for antialiasing. It might mandate using a target texture, but I always do that anyway.
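For readers who haven’t used that workflow, a minimal sketch of the target-texture pattern in SDL3 renderer terms (error handling omitted; where the gamma handling would go is left as a comment, since nothing in this thread confirms built-in support for it):

```c
#include <SDL3/SDL.h>

/* Render the scene into a target texture, then present that texture. */
static void draw_frame(SDL_Renderer *renderer, SDL_Texture *target)
{
    SDL_SetRenderTarget(renderer, target);    /* draw the scene into the texture */
    SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
    SDL_RenderClear(renderer);
    /* ... scene drawing (and any gamma handling) would go here ... */

    SDL_SetRenderTarget(renderer, NULL);      /* switch back to the window */
    SDL_RenderTexture(renderer, target, NULL, NULL);
    SDL_RenderPresent(renderer);
}

/* 'target' would be created once, with something like:
 *   SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
 *                     SDL_TEXTUREACCESS_TARGET, width, height);
 */
```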

Your eyes must be very different from mine! Aliasing (jaggies) on edges is grossly unpleasant to me, and leads to an entirely unrealistic rendering. Even approximate antialiasing (such as my animated image above) is vastly better in my judgment.

It would be bloatware when used for full-scene software rendering every frame, though it could be reasonable to include correct conversions between sRGB and linear for individual colors and for linear color transformations.

First of all, the term ‘aliasing’ refers to a frequency above the Nyquist limit turning into a lower frequency through sampling; it’s not really something that happens on an individual-edge basis. Aliasing tends to occur when high-frequency patterns are involved, in which case incorrect blending produces an unusable result as well. Some users even use ‘aliasing’ as a pejorative term for bilevel rendering in general; this kind of usage implicitly encourages e-waste, since bilevel rendering is often the only reasonable render mode on legacy computers (an extreme case of this unethical practice is Windows Terminal: not only is the bilevel render mode called ‘aliased’ there, but there is also no way to get full hinting together with anti-aliasing, it’s not gamma-correct by default, and it uses Cascadia Code/Mono by default, which isn’t even properly aligned for box-drawing characters; it is highly likely that Microsoft’s text renderers are only going to get worse in the future). I don’t think there is anything inherently ‘unrealistic’ about taking the center of the pixel; having a bias towards zero when blending isn’t any more realistic, after all.

I know exactly what aliasing in images is; it’s a subject I specialised in during my 33-year career with BBC Research & Development! The ‘jaggies’ on edges that are neither vertical nor horizontal are a classic example of aliasing. They occur precisely because the (notional) original signal wasn’t filtered to remove frequencies above the Nyquist limit before sampling.

You really should read Joel Spolsky’s classic article outlining precisely how ridiculous it is to use the term “bloatware” like that.

Most people are not doing software rendering, let alone the kind that you apparently are. The person above asking for anti-aliasing was almost certainly referring to having it supported in SDL_Renderer.

Of course, the problem with having it built into SDL_Renderer is that there are a lot of different ways to do antialiasing, and there is no single solution that is both supported on all hardware and fast on all hardware.
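As one concrete example of that variety, hardware MSAA can be requested through a GL-backed window rather than through SDL_Renderer; whether the driver honors it, and how fast it is, varies (a sketch, not a recommendation):

```c
#include <SDL3/SDL.h>

/* Request a 4x multisampled default framebuffer before creating a
 * GL-backed window; the driver is free to ignore or emulate it. */
static SDL_Window *create_msaa_window(void)
{
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4);
    return SDL_CreateWindow("demo", 640, 480, SDL_WINDOW_OPENGL);
}
```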

LOL, :roll_eyes:


@rtrussell @Mason_Wheeler @sjr At this point it seems like you are deliberately trying to annoy me or something, rather than actually looking forward to SDL 3.0.

I am looking forward to SDL 3. Especially SDL_GPU.


However, “LOL, :roll_eyes:” is an insult that doesn’t lead anywhere useful and only hinders progress.


Does this mean we will be able to compile and run games written for SDL 1.2 using SDL 3 by somehow chaining sdl12-compat with sdl2-compat?

Yes, this is intended to work, and we’ll consider it a bug when it doesn’t.

We probably won’t update sdl12-compat to talk to SDL3 directly, but chaining like this will work.

At one point I tried to switch to SDL_GPU, but it turned out that glScissor was used heavily somewhere inside the library, which led to a significant drop in FPS on Adreno GPUs.
So I abandoned SDL_GPU and simply added triangle drawing in SDL myself, back when that wasn’t yet in SDL itself.
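(For anyone curious, the triangle drawing that has since been built into the renderer API is SDL_RenderGeometry; a minimal untextured sketch in SDL3 terms, offered as my own illustration rather than anything from this thread:)

```c
#include <SDL3/SDL.h>

/* Draw one untextured, per-vertex-colored triangle with the renderer API. */
static void draw_triangle(SDL_Renderer *renderer)
{
    const SDL_Vertex verts[3] = {
        { { 100.0f,  50.0f }, { 1.0f, 0.0f, 0.0f, 1.0f }, { 0.0f, 0.0f } },
        { {  50.0f, 150.0f }, { 0.0f, 1.0f, 0.0f, 1.0f }, { 0.0f, 0.0f } },
        { { 150.0f, 150.0f }, { 0.0f, 0.0f, 1.0f, 1.0f }, { 0.0f, 0.0f } },
    };
    SDL_RenderGeometry(renderer, NULL, verts, 3, NULL, 0);
}
```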

The older third-party SDL_gpu library isn’t related to the upcoming GPU subsystem in SDL3 at all, aside from sharing a name. They both use the GPU as well of course, but with very different design philosophies and functionality. (The SDL3 GPU subsystem doesn’t have an OpenGL backend, for example.)


Yeah, I was talking about the new GPU subsystem that’s going to be in SDL 3.

Oh, thanks, I get it now. Meanwhile, where can I read about the upcoming SDL_GPU subsystem?

where can I read about the upcoming SDL_GPU subsystem?

I’ve collected some FAQs and summary info here:
