With the release of SDL 2.26, development focus is shifting to SDL 3.0.
We have 10 years of wishlist items that have been waiting for a new major version where we can safely break the ABI, and we are finally ready to make that change.
We will still continue to provide bug fixes for SDL 2, but new features and complex changes will be implemented in SDL 3.
Our general plan is:
Change the version number and library name so it will not conflict with SDL2 installations.
Create sdl2-compat, a binary compatibility library that will run SDL2 applications on the SDL3 runtime, similar to what sdl12-compat has done for ancient SDL 1.2 games.
Separate the wiki so we have documentation for both SDL2 and SDL3
Go through and do code and API cleanup
Review the set of supported build systems
Review the set of supported platforms
Review the set of supported APIs
Review hints for promotion to new API functions
Create SDL 3.1.0 prerelease candidate for validation and public feedback
Iterate as needed to create a clean stable ABI for SDL 3.x
Start development on new features
Here is our current SDL 3.0 bug list, which will certainly evolve over time:
Please don't bring the various SDL add-ons into the main library. Too many dependencies.
Also, the suggestion for more consistent naming, like SDL_RenderDoThing() for everything instead of some functions being SDL_DoThingRender(), is definitely a good idea.
Please take this opportunity to reconsider how HighDPI works.
Right now some functions operate in physical pixels and some operate in scaled (logical) pixels; it's not always clear which is which, and there is no simple way to convert between them. This is a mess, and it basically makes it impossible to write games with HighDPI support on a StandardDPI computer.
My first choice would be to always use physical pixels, no exceptions.
My second choice would be to define a global DPIScaling variable that is initialized to the "native" scaling factor provided by the OS, but can be changed at will. All APIs use physical pixels scaled by the DPIScaling variable, no exceptions. If DPIScaling is left unchanged, we get the behavior of not using SDL_WINDOW_ALLOW_HIGHDPI, where the HighDPI display pretends to be a StandardDPI display in all regards. If DPIScaling is set to 1, we always use physical pixels, no exceptions. If DPIScaling is set to any other value, this also works, which allows 320x200 LowDPI games to pretend that they are running on a 320x200 display.
I like your suggestions, just one minor point: the DPIScaling variable shouldn't be global, but per screen (or maybe per window, automatically updated when the window is moved to another screen, perhaps with some kind of SDL_WINDOWEVENT_SCALECHANGED event), because nowadays operating systems (should) support different screens having different DPIs (I know Windows does).
I think I would prefer "always use physical pixels", and then provide a scaling factor per window that the application can request and use to do scaling itself, if it wants to match OS scaling.
Approximately 10 years ago, I contributed a new API to SDL, SDL_RenderSetLogicalSize. Its purpose was quite simple: it was meant exactly to remap the renderer's coordinates to a "virtual window" with a different size in pixels, and nothing more.
This API was then taken and rewritten by somebody inside SDL to do significantly more than that by adding letterboxing to enforce that the logical resolution must preserve the physical window's aspect ratio, which ended up mutilating it and making it unsuitable for its original purpose. Unfortunately I didn't catch the change until after the SDL 2.0 release came out, by which point Ryan refused to even consider fixing it because someone somewhere might be using the broken version and depending on its broken behavior.
SDL_RenderSetLogicalSize has been broken ever since, and has been the source of several questions and complaints on this list over the years, by people who don't understand how it works.
Sam did tell me that he'd like to fix it in the next API revision. Well, here we are. Can we finally get a real logical size API in SDL, and move the rest of this unrelated stuff off to its own API?
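To make the distinction concrete, here is a rough standalone sketch (not SDL code; both helpers are invented for illustration) of the two behaviors: the plain coordinate remap the API was originally written for, versus the aspect-preserving letterbox the current implementation adds on top:

```c
/* Original intent: direct stretch from logical to window coordinates,
 * no aspect-ratio correction. */
static int stretch_x(int lx, int logical_w, int window_w) {
    return lx * window_w / logical_w;
}

/* Current behavior (roughly): compute a letterboxed viewport that
 * preserves the logical aspect ratio, centered in the window. */
static void letterbox(int logical_w, int logical_h,
                      int window_w, int window_h,
                      int *vp_x, int *vp_y, int *vp_w, int *vp_h) {
    float want = (float)logical_w / (float)logical_h;
    float have = (float)window_w / (float)window_h;
    if (have > want) {            /* window too wide: pillarbox */
        *vp_h = window_h;
        *vp_w = (int)(window_h * want);
    } else {                      /* window too tall: letterbox */
        *vp_w = window_w;
        *vp_h = (int)(window_w / want);
    }
    *vp_x = (window_w - *vp_w) / 2;
    *vp_y = (window_h - *vp_h) / 2;
}
```

A 320x240 logical size in a 1280x720 window stretches the full width under the first scheme, but only fills a centered 960x720 viewport under the second, which is exactly the behavior change being complained about.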
A per-display value for DPIScaling makes sense. The problem with a per-window value is that it doesn't solve the problem of creating a correctly sized window in the first place. This affects not just SDL_CreateWindow but also SDL_GetDesktopDisplayMode/SDL_GetDisplayUsableBounds, which I use to select the correct window size and scaling factor.
There's been some discussion and planning around improved DPI scaling support already (in the issue tracker etc). Some parts of SDL like display or display mode APIs definitely need more info exposed.
But it's also a very complex and delicate thing to get right. For example, just using a scale factor everywhere instead of querying pixel dimensions can cause subtle but app-breaking issues from rounding differences in window/backbuffer sizes across operating systems and backends (try creating a depth buffer for a backbuffer using the wrong OS-dependent rounding and everything will blow up). The Unity game engine has recently had bugs related to exactly that in its internal code.
SDL 2.26.0 also got a new API to make querying the pixel dimensions of the window easier without needing to call a specific function depending on the graphics API being used.
It would be nice to see the public API replace the SintN, UintN, and SDL_bool types with their equivalent <stdint.h> and <stdbool.h> types.
Along similar lines, I would really like to see every integer field or parameter changed to an explicitly-sized type (presumably int32_t/uint32_t by default).
The former is mainly about reducing the cognitive load of (visually) parsing the API, but the latter would be particularly helpful for binding to languages where all types are explicitly sized (like C#) and languages where the size of the basic integer type varies by platform (like Go).
Are there any platforms we'll be supporting which don't have these available? Even non-C99-compliant compilers like OpenWatcom have it, and there are msinttypes and similar shims for older compilers without stdint.h.
And I think the plan is to make <stdbool.h> obsolete in C23, and just have "bool" as a built-in type regardless.
This gets even more complicated on platforms that deal in logical points vs. pixels to begin with. Throw in multiple, mixed-DPI displays and it gets even worse. You need to find the logical window size that will give you the desired backbuffer size so the output is as close to 1:1 as possible, but in some cases you don't know what that is: until the window is created and shown, the display on which the window will be located is unknown, and so is the scale factor.
Then there is the problem of moving things between displays: if you drag a window between a display with 1.0 scaling and 1.5 scaling, should only the backbuffer size change, or should the whole window jump in size while trying to keep the backbuffer at a fixed size? With logical points, this is left up to how the desktop environment best wants to handle it, while using physical pixels means that the window will always jump in size, even when the desktop environment was designed around preventing this behavior.
I would argue for logical points instead of physical pixels, as it alleviates these pain points and allows individual desktop environments to best handle mixed-DPI setups in the way they were natively designed to.
Something that SDL 3 could use is a "High-DPI Best Practices" guide that clearly covers everything applications need to behave properly across platforms and not break in mixed-DPI environments. There have been DPI-related bugs reported where it turned out the applications were using hacks to guess at some global scale value, assuming fixed scale values, etc., which breaks in multi-monitor setups.
Neither. The correct size of my window is always the biggest integer multiple of my base resolution that fits on the current display. Move it to a bigger display, the window grows bigger. Move it to a smaller display, it gets smaller.
Since this is highly application-dependent behavior, neither SDL nor the base operating system can be expected to provide this functionality. The best thing both of them can do is to give me physical pixels and to stay out of my way. I'll handle scaling my own way.
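That sizing policy is easy to sketch in a few lines of plain C (the function name is made up, and the display size would come from whatever physical-pixel query the platform provides):

```c
/* Pick the largest integer scale of a base resolution that still fits
 * the current display, clamped so we never go below 1x. This is the
 * "biggest integer multiple" policy described above, sketched as a
 * standalone helper rather than anything SDL offers. */
static int best_integer_scale(int base_w, int base_h,
                              int display_w, int display_h) {
    int sx = display_w / base_w;
    int sy = display_h / base_h;
    int s = sx < sy ? sx : sy;
    return s > 0 ? s : 1;
}
```

For a 320x200 base resolution on a 1920x1080 display this yields 5x (1600x1000), and moving to a 640x400 display drops it to 2x, which is the grow/shrink behavior the poster wants to implement in application code.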
No it wouldn't. Separate physical and logical pixels is bad enough. Adding a third factor to deal with just makes it worse. Especially if that third factor can't even do letterboxing. I'll do my own scaling and letterboxing, thank you very much.
> Since this is highly application-dependent behavior, neither SDL nor the base operating system can be expected to provide this functionality. The best thing both of them can do is to give me physical pixels and to stay out of my way. I'll handle scaling my own way.
I completely agree.
And I'd add "give me the scaling factor (based on display DPI) of the current display so I can base my own scaling on it, if I want to (make fonts and buttons bigger, etc.)". By "base my own scaling on it" I mean that I want to get a float, and then I can write my own logic to use a scaling factor close to that value that looks good (sharp fonts, etc.).
Update: I just realized there are two kinds of scaling involved (which are related, of course): window size and GUI elements. Even if you don't resize the window, you may (or may not, it depends) want to scale text etc. when moving to another display with a different DPI.
And note that automagically changing the window size when moving to another display would suck, because many games don't handle this (you'd have to change framebuffer sizes in OpenGL etc.); they don't support dynamically changing the window size, but instead require you to select a resolution in a menu and apply it, which often restarts the whole renderer.