Trying to grasp new rendering model

Hi,

I’m currently trying to get into the details of the new rendering model.
I haven’t found any documentation of the details except the code. If
there is some documentation please redirect me and I’ll try to solve the
questions myself.

Here is a summary of how I understand SDL 1.3 right now:

  1st layer: The video driver.
  • Allows you to handle windows (create/move/delete them, and also create
    fullscreen ones)
  • Allows you to create textures in a desired pixel format, which are
    possibly managed by hardware under the hood
  2nd layer: The renderer, which manages the hardware abstraction. In the
    Win32 case, GDI and D3D are currently available.

I’m trying to adapt the GAPI code to this new model, and the following
questions came up:

  • In SDL_compat I found the following code:

/* Create a texture for the screen surface */
SDL_VideoTexture = SDL_CreateTexture(desired_format,
                                     SDL_TEXTUREACCESS_STREAMING,
                                     width, height);

How do you know that this returns the screen surface? Isn’t a texture
an arbitrary image, handled by the renderer, but not necessarily
connected to the screen (analogous to an OpenGL texture)?

  • There are the renderers GL/GLES/SW which are not directly connected
    to a video driver. How are they used? Can I create a Win32 video
    driver utilizing the GL renderer?

  • As GAPI basically only gives access to video memory, I am looking for
    a way to use existing software rendering code paths. So it all boils
    down to the question: how do I create a renderer which uses the texture
    handling of SDL_renderer_sw and provides a single texture which maps to
    the screen’s video memory?

I think my biggest problem is that I don’t see the connection between an
SDL_Texture and the pixel buffer of the screen/window. I also don’t
understand the difference between an SDL_Texture and an SDL_Surface. Is a
texture just a potentially hardware-accelerated surface?

I would be really happy if someone could shed some light on this.

Thanks a lot
Stefan

Hi,

I’m currently trying to get into the details of the new rendering model. I haven’t found any documentation of the details except the code. If there is some documentation please redirect me and I’ll try to solve the questions myself.

Try looking here: http://www.libsdl.org/tmp/SDL-1.3-docs/

It’s not complete, but it’s fairly decent.


Mason Wheeler wrote:

Hi,

I’m currently trying to get into the details of the new rendering model. I haven’t found any documentation of the details except the code. If there is some documentation please redirect me and I’ll try to solve the questions myself.

Try looking here: http://www.libsdl.org/tmp/SDL-1.3-docs/

Thanks for this link, I didn’t know that page. But these docs are the
doxygen ones automatically generated from the SDL headers, which I’ve
read already. I was looking for documentation besides that.

Stefan

I’m currently trying to get into the details of the new rendering model.
I haven’t found any documentation of the details except the code. If
there is some documentation please redirect me and I’ll try to solve the
questions myself.

There isn’t any yet, and these are great questions. I’ll answer them here.

Here is a summary of how I understand SDL 1.3 right now:

  1st layer: The video driver.
  • Allows you to handle windows (create/move/delete them, and also create
    fullscreen ones)
  • Allows you to create textures in a desired pixel format, which are
    possibly managed by hardware under the hood
  2nd layer: The renderer, which manages the hardware abstraction. In the
    Win32 case, GDI and D3D are currently available.

Yes, that’s correct.

I’m trying to adapt the GAPI code to this new model, and the following
questions came up:

  • In SDL_compat I found the following code:

/* Create a texture for the screen surface */
SDL_VideoTexture = SDL_CreateTexture(desired_format,
                                     SDL_TEXTUREACCESS_STREAMING,
                                     width, height);

How do you know that this returns the screen surface? Isn’t a texture
an arbitrary image, handled by the renderer, but not necessarily
connected to the screen (analogous to an OpenGL texture)?

It is. In this case SDL_compat.c is creating an arbitrary texture that it
will use to represent the traditional “screen surface”.
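
Roughly, the compat path then does something like this each frame (a
sketch, not the exact SDL_compat.c code; “screen” stands for the surface
holding the 1.2-style pixels, and the explicit renderer argument follows
the current headers, so your snapshot may differ):

/* Upload the screen surface's pixels into the streaming texture,
   then copy the whole texture to the window and flip. */
SDL_UpdateTexture(SDL_VideoTexture, NULL, screen->pixels, screen->pitch);
SDL_RenderCopy(renderer, SDL_VideoTexture, NULL, NULL);
SDL_RenderPresent(renderer);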

  • There are the renderers GL/GLES/SW which are not directly connected
    to a video driver. How are they used? Can I create a Win32 video
    driver utilizing the GL renderer?

If the video driver supports creating an OpenGL context, the OpenGL
renderer will work. Take a look at SDL_win32opengl.c for an example.

  • As GAPI basically only gives access to video memory, I am looking for
    a way to use existing software rendering code paths. So it all boils
    down to the question: how do I create a renderer which uses the texture
    handling of SDL_renderer_sw and provides a single texture which maps to
    the screen’s video memory?

There’s a good example of this in the “dummy” renderer (SDL_nullrender.c).

I think my biggest problem is that I don’t see the connection between an
SDL_Texture and the pixel buffer of the screen/window. I also don’t
understand the difference between an SDL_Texture and an SDL_Surface. Is a
texture just a potentially hardware-accelerated surface?

Yes, a texture is just a potentially hardware-accelerated surface. In the
new video model the front buffer isn’t directly accessible. All composition
is done by drawing or blitting textures onto either a back buffer or
directly to the front buffer (using SDL_RENDERER_SINGLEBUFFER).
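
For example (a sketch; the return type and signature follow the current
headers):

/* Ask for a renderer that draws straight to the front buffer.
   Without SDL_RENDERER_SINGLEBUFFER you get at least one back
   buffer, and composition becomes visible on SDL_RenderPresent(). */
SDL_Renderer *renderer = SDL_CreateRenderer(window, -1,
                                            SDL_RENDERER_SINGLEBUFFER);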

I would be really happy if someone could shed some light on this.

Does that help? Feel free to ask as many questions as you want. :)

See ya!
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

Yes, a texture is just a potentially hardware-accelerated surface. In the
new video model the front buffer isn’t directly accessible. All composition
is done by drawing or blitting textures onto either a back buffer or
directly to the front buffer (using SDL_RENDERER_SINGLEBUFFER).

To expand on this a little, SDL_compat.c creates a single large texture
that it uses as the screen surface and then uses the current renderer to
copy it to the visible display. Fortunately this is essentially a
generalization of how SDL 1.2 works under the hood, so the speed is
roughly equivalent.

The advantage of the new model is that you aren’t required to make a
full screen texture. You can instead create a number of smaller textures
and use them along with rect and line drawing to compose your scene, which
lends itself well to being hardware accelerated.
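
A frame might then look like this (a sketch; “sprite”, “panel” and the
destination rects are placeholders, and the draw-call names follow the
current headers):

/* Compose the scene from small textures plus rect and line drawing,
   then present the back buffer. */
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, sprite, NULL, &dst);      /* small texture */
SDL_SetRenderDrawColor(renderer, 200, 200, 200, 255);
SDL_RenderFillRect(renderer, &panel);              /* rect drawing */
SDL_RenderDrawLine(renderer, 0, 240, 640, 240);    /* line drawing */
SDL_RenderPresent(renderer);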

Anyone who uses the older-style 1.2 API continues to get access to a
framebuffer interface at normal CPU blitting speed, and anyone who uses
the newer API can get hardware-accelerated rendering.

See ya!
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

Sam Lantinga wrote:

Yes, a texture is just a potentially hardware-accelerated surface. In the
new video model the front buffer isn’t directly accessible. All composition
is done by drawing or blitting textures onto either a back buffer or
directly to the front buffer (using SDL_RENDERER_SINGLEBUFFER).

To expand on this a little, SDL_compat.c creates a single large texture
that it uses as the screen surface and then uses the current renderer to
copy it to the visible display. Fortunately this is essentially a
generalization of how SDL 1.2 works under the hood, so the speed is
roughly equivalent.

The advantage of the new model is that you aren’t required to make a
full screen texture. You can instead create a number of smaller textures
and use them along with rect and line drawing to compose your scene, which
lends itself well to being hardware accelerated.

Anyone who uses the older-style 1.2 API continues to get access to a
framebuffer interface at normal CPU blitting speed, and anyone who uses
the newer API can get hardware-accelerated rendering.

Thanks, this really helped a lot…
I already had a look at SDL_nullrender before, but I overlooked
Setup_SoftwareRenderer();

Now everything really makes sense ;)
I don’t know if I feel totally comfortable without direct screen buffer
access, but yes, it’s OK.

One last thing, just to be sure ;-). With the new model, there is some
neat generalization in place, together with a fallback for the renderer
flags.
I.e. if I want a renderer with SDL_RENDERER_PRESENTFLIP3 but my
video driver doesn’t support this, the SW driver will kick in and
emulate triple buffering.
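
In code, I mean something like this (a sketch; the flag is from the
headers, the rest is just how I’d expect it to work):

/* Ask for triple buffering; if the hardware driver can't do it
   natively, the software fallback should kick in (as I understand it). */
SDL_Renderer *renderer = SDL_CreateRenderer(window, -1,
                                            SDL_RENDERER_PRESENTFLIP3);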

But as far as I’ve seen there are no fallbacks for texture formats,
blend modes etc.
Here’s an imaginary example:
On a target machine I have some render driver with hardware acceleration
but no blending support.
Now if I use blending, the renderer will fail and I have to provide the
fallback myself…?
So either the application developer has to check every renderer in
advance for its capabilities, or always force the software driver?

The main question for me is:
Should a render driver fail in any situation it can’t handle, possibly
forcing the users to the SW driver? Or should every render driver try to
expose all possibilities (blending, clipping etc.) using its own,
possibly badly implemented, fallbacks?
Or is there any SDL-global fallback strategy available or planned?

From a user’s point of view I would expect SDL to handle all cases,
probably issuing a warning if something forces a non-HW path.

Cheers
Stefan

The main question for me is:
Should a render driver fail in any situation it can’t handle, possibly
forcing the users to the SW driver? Or should every render driver try to
expose all possibilities (blending, clipping etc.) using its own,
possibly badly implemented, fallbacks?
Or is there any SDL-global fallback strategy available or planned?

From a user’s point of view I would expect SDL to handle all cases,
probably issuing a warning if something forces a non-HW path.

Here’s a really concrete example ;)
Should the gapi driver be implemented with software fallbacks like the
Dummy driver, which allows the user to do

SetEnv(SDL_VIDEO_RENDERER, "gapi")
…create renderer and do everything…

or should I implement a slick minimal renderer which would force the
user to do

SetEnv(SDL_VIDEO_RENDERER, "software")
SetEnv(SDL_VIDEO_RENDERER_SWDRIVER, "gapi")
…create renderer and do everything…

which basically would lead to the same results but requires more
knowledge on the user side.

What’s the “SDL” way to do it?

Cheers Stefan

/* Create a texture for the screen surface */
SDL_VideoTexture = SDL_CreateTexture(desired_format,
                                     SDL_TEXTUREACCESS_STREAMING,
                                     width, height);

How do you know that this returns the screen surface? Isn’t a texture
an arbitrary image, handled by the renderer, but not necessarily
connected to the screen (analogous to an OpenGL texture)?

It is. In this case SDL_compat.c is creating an arbitrary texture that it
will use to represent the traditional “screen surface”.

I don’t know if I feel totally comfortable without direct screen buffer
access, but yes, it’s OK.

Direct framebuffer access is useful for a variety of special effects,
but so-called “shader programs” are the future for these techniques.
Failing software emulation of GLSL in SDL, perhaps the FBO GL
extension could be used instead of creating a texture that acts like a
framebuffer?

On Tue, 10 Feb 2009 13:05:09 -0500, Donny Viszneki <donny.viszneki at gmail.com> wrote:

Failing software emulation of GLSL in SDL, perhaps the FBO GL
extension could be used instead of creating a texture that acts like a
framebuffer?

Speaking of OpenGL and GLSL, is it possible to mix OpenGL calls with
SDL calls in SDL 1.3 (assuming the OpenGL driver is used, of course)?
And can custom GLSL programs be enabled for use with SDL_RenderCopy
etc.? (If not, at least a way to get the OpenGL texture id from an
SDL_Texture, so the texture can be modified using straight OpenGL?)

Also, for window management, I don’t see a way to make windows “always
on top”, which could be useful for some things. Being able to
select “tool style” window decorations would be nice too (a slimmer
titlebar, e.g. like the toolboxes in GIMP).

Oh, and maybe a way to make SDL_SetWindowPosition not generate motion
events? (I was fiddling around a bit with moving windows by
clicking and dragging inside them. Disabling events for the move kinda
makes it work, but it’s a little jittery, and the mouse still moves
around a bit and can lose its hold, even when using WarpMouseInWindow.)

Also, why is the window position set by SDL_SetWindowPosition restricted
to keep the window completely within the desktop area? Is it KDE
messing things up again? (If this is possible to fix, please do…)

And, while I’m at it (heh), any plans for supporting windows with an
alpha channel? Wouldn’t be supported everywhere obviously, but at least
Windows, Mac and Linux have composited desktops now.

  - Gerry

On Tue, 10 Feb 2009 13:05:09 -0500, Donny Viszneki <donny.viszneki at gmail.com> wrote:

Failing software emulation of GLSL in SDL, perhaps the FBO GL
extension could be used instead of creating a texture that acts like a
framebuffer?

Speaking of OpenGL and GLSL, is it possible to mix OpenGL calls with
SDL calls in SDL 1.3 (assuming the OpenGL driver is used, of course)?

Yes, if you are careful to maintain the correct OpenGL state.
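
Something along these lines (a sketch; exactly what needs saving depends
on which GL state the renderer caches internally):

/* Mix raw GL with the GL renderer: save the state you touch,
   draw, then restore before handing control back to SDL. */
glPushAttrib(GL_ALL_ATTRIB_BITS);  /* note: doesn't cover matrices */
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
/* ... custom GL drawing here ... */
glPopMatrix();
glPopAttrib();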

And can custom GLSL programs be enabled for use with SDL_RenderCopy
etc.? (If not, at least a way to get the OpenGL texture id from an
SDL_Texture, so the texture can be modified using straight OpenGL?)

Not at the moment, but if you look at it and see a good way to do that,
I’m open to suggestions.

Also, for window management, I don’t see a way to make windows “always
on top”, which could be useful for some things. Being able to
select “tool style” window decorations would be nice too (a slimmer
titlebar, e.g. like the toolboxes in GIMP).

Always on top is not a bad idea, I’ll add it to the list.

Oh, and maybe a way to make SDL_SetWindowPosition not generate motion
events? (I was fiddling around a bit with moving windows by
clicking and dragging inside them. Disabling events for the move kinda
makes it work, but it’s a little jittery, and the mouse still moves
around a bit and can lose its hold, even when using WarpMouseInWindow.)

You mean mouse motion events?

Also, why is the window position set by SDL_SetWindowPosition restricted
to keep the window completely within the desktop area? Is it KDE
messing things up again? (If this is possible to fix, please do…)

It’s not; KDE may be doing something there.

And, while I’m at it (heh), any plans for supporting windows with an
alpha channel? Wouldn’t be supported everywhere obviously, but at least
Windows, Mac and Linux have composited desktops now.

I’ve thought about it a bit. The most obvious way to do this would be to
expose video modes that have an alpha channel. I haven’t completely
fleshed the idea out yet.

See ya!
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

The main question for me is:
Should a render driver fail in any situation it can’t handle, possibly
forcing the users to the SW driver? Or should every render driver try to
expose all possibilities (blending, clipping etc.) using its own,
possibly badly implemented, fallbacks?
Or is there any SDL-global fallback strategy available or planned?

The general idea is that the drivers should handle all the cases that can
be accelerated, and the application requests a renderer that has the
capabilities it needs; this may end up being the software renderer
if the hardware driver has limited capabilities.

This isn’t finalized yet, especially with regards to texture formats, but
that’s the general idea.
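
For example, an application could scan the available renderers for the
capabilities it needs (a sketch; the struct fields and flag names follow
the current headers):

/* Pick the first render driver that advertises acceleration. */
int i, n = SDL_GetNumRenderDrivers();
for (i = 0; i < n; ++i) {
    SDL_RendererInfo info;
    if (SDL_GetRenderDriverInfo(i, &info) == 0 &&
        (info.flags & SDL_RENDERER_ACCELERATED)) {
        /* also check texture formats, blend modes, ... here */
        break;
    }
}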

See ya!
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC

or should I implement a slick minimal renderer which would force the
user to do

SetEnv(SDL_VIDEO_RENDERER, "software")
SetEnv(SDL_VIDEO_RENDERER_SWDRIVER, "gapi")

In theory you shouldn’t have to set any environment variables. The gapi
driver should come before the software driver in the renderer list. Then,
when the application calls SDL_CreateRenderer(window, -1, flags), it will
pick the most accelerated renderer that supports the given features.

If an application has more specific needs or wants a specific driver, it
can pass a valid index in to get a specific driver.
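
For example (a sketch; the info.name field follows the current headers):

/* Find the "gapi" driver by name and request it explicitly;
   index stays -1 (auto-pick) if it isn't available. */
int i, index = -1;
for (i = 0; i < SDL_GetNumRenderDrivers(); ++i) {
    SDL_RendererInfo info;
    if (SDL_GetRenderDriverInfo(i, &info) == 0 &&
        SDL_strcmp(info.name, "gapi") == 0) {
        index = i;
        break;
    }
}
SDL_Renderer *renderer = SDL_CreateRenderer(window, index, 0);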

See ya!
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC