SDL2 pixel graphics pipeline

What is the pixel graphics pipeline of SDL2?

I posted a detailed question to StackOverflow, but it doesn't matter where you reply:
https://stackoverflow.com/questions/20932807/sdl2-pixel-graphics-pipeline

I first thought about posting the question here, but posted it on StackOverflow instead, because it is more exposed to newbies like me.

2014/1/5 techtonik:

What is the pixel graphics pipeline of SDL2?

Hmm. The final pipeline (when doing hardware-accelerated drawing) is that of the backing implementation that you end up using, be that OpenGL, Direct3D, etc.

Generally, you load an image from disk (e.g. with SDL2_image) and it ends up in a surface, which is just a chunk of pixels in RAM with a description of the format etc.

From that you create a texture, which is an image managed by the 3D implementation (and so in 99% of cases will end up in VRAM), which is then drawn to the screen using 4 textured vertices.
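
For reference, a minimal sketch of that surface -> texture -> screen path with the plain SDL2 C API (the window size and image filename are placeholders, and error checking is omitted):

    #include <SDL.h>
    #include <SDL_image.h>

    int main(int argc, char *argv[]) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

        /* Surface: pixels in system RAM, loaded by SDL2_image. */
        SDL_Surface *surf = IMG_Load("image.png");      /* placeholder filename */
        /* Texture: the same image, now owned by the rendering backend (VRAM). */
        SDL_Texture *tex = SDL_CreateTextureFromSurface(ren, surf);
        SDL_FreeSurface(surf);                          /* RAM copy no longer needed */

        /* Draw the texture (internally a textured quad) and show the frame. */
        SDL_RenderClear(ren);
        SDL_RenderCopy(ren, tex, NULL, NULL);
        SDL_RenderPresent(ren);
        SDL_Delay(2000);

        SDL_DestroyTexture(tex);
        SDL_DestroyRenderer(ren);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }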

If you want to apply effects via direct pixel access in RAM, you will have to go the slow route of manipulating your surface and re-uploading it to your texture. The faster route would be to drop SDL2's render API altogether and just use it to set up your window / handle events, then use OpenGL for the actual drawing (where you can use shader programs that are executed directly on the GPU).
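
For completeness, a bare sketch of that route: SDL2 only provides the window, the GL context and the event loop, and every draw call is plain OpenGL (the actual shader/vertex work is left out):

    #include <SDL.h>
    #include <SDL_opengl.h>

    int main(int argc, char *argv[]) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("gl demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480,
                                           SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        int running = 1;
        while (running) {
            SDL_Event ev;
            while (SDL_PollEvent(&ev))
                if (ev.type == SDL_QUIT)
                    running = 0;

            glClearColor(0.f, 0.f, 0.f, 1.f);
            glClear(GL_COLOR_BUFFER_BIT);
            /* ...your own GL draw calls / shader programs go here... */
            SDL_GL_SwapWindow(win);
        }

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }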

Maybe you should try to explain why you need this information / what you're trying to do under SDL2, and then we can help you accomplish that?

Have a look at the SDL_TEXTUREACCESS_STREAMING flag of SDL_CreateTexture (http://wiki.libsdl.org/SDL_CreateTexture). By design, you can modify texture data created with this flag: lock the texture with SDL_LockTexture, modify the returned memory and unlock it again. Then draw the texture to the screen. Some simple code examples are actually part of the SDL 1.2 Migration Guide (http://wiki.libsdl.org/MigrationGuide). Another possible way might be to use the Render functions, which is what SDL2_gfx (http://sourceforge.net/projects/sdl2gfx/) is using to draw primitives like lines, boxes, circles, etc.
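
A rough sketch of that lock/modify/unlock cycle in C (it assumes an existing SDL_Renderer *ren; the ARGB8888 format and 320x200 size are only for illustration):

    /* Assumes an existing SDL_Renderer *ren. */
    SDL_Texture *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING, 320, 200);

    void *pixels;
    int pitch;                               /* bytes per row, may be padded */
    if (SDL_LockTexture(tex, NULL, &pixels, &pitch) == 0) {
        for (int y = 0; y < 200; ++y) {
            Uint32 *row = (Uint32 *)((Uint8 *)pixels + y * pitch);
            for (int x = 0; x < 320; ++x)
                row[x] = 0xFF000000 | (Uint32)(x ^ y);   /* write your effect here */
        }
        SDL_UnlockTexture(tex);              /* uploads the modified contents */
    }

    SDL_RenderClear(ren);
    SDL_RenderCopy(ren, tex, NULL, NULL);
    SDL_RenderPresent(ren);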


SDL2_gfx is a bit redundant now, isn't it? I mean, SDL2 can now draw lines and boxes.


True. SDL2_gfx is mostly there to provide API parity with SDL_gfx, but it also provides quite a few primitives which are not included in SDL: aaline, circle, aacircle, ellipse, aaellipse, arc, rounded rectangle, thick line, bezier line, bitmap characters/strings.


As another rendering option, I'm now getting SDL_gpu and its API ready for a build release in the next month or so.

SDL_gpu is an alternative to the SDL renderer subsystem that aims to have a full set of basic features for 2D applications, with support for shaders and direct GL calls. Help on the implementation and determining the API is highly appreciated, by the way. :)

Jonny D

Jonas Kulla wrote:

Hmm. The final pipeline (when doing hardware-accelerated drawing) is that of the backing implementation that you end up using, be that OpenGL, Direct3D, etc.

Thanks. That makes it clearer. So SDL2 supports different pipelines and, if I get it right, it selects the one that should be used automatically. Is there a way to select the pipeline myself? I mean, there should be for sure, but can I get a list of those and their components (steps) from somewhere? I doubt that I can query SDL2 itself (the ideal way would be a Python module with a command line interface), but perhaps some docs…

I understand that SDL hides these details, but shouldn't I care about pipeline details to make my application work fast? I also would like to get the full picture before going into the details of every pipeline. It also helps to understand how SDL2 can be applied better (like, can I extend SDL with more pipelines for my imaginary embedded system?).

Jonas Kulla wrote:

Generally, you load an image from disk (e.g. with SDL2_image) and it ends up in a surface, which is just a chunk of pixels in RAM with a description of the format etc. From that you create a texture, which is an image managed by the 3D implementation (and so in 99% of cases will end up in VRAM), which is then drawn to the screen using 4 textured vertices.

If you want to apply effects via direct pixel access in RAM, you will have to go the slow route of manipulating your surface and re-uploading it to your texture.

If I want to program my own 2D graphics effect from scratch, I thought the fastest way would be to change values in the chunk of pixels in RAM directly. Why can't I write them to VRAM directly?

Also, what is the format of this chunk of pixels? I would like to use the old-school VGA palette, because the drawing algorithm uses values 0…255 for every color, not RGBA (which I think is the format for textures). Where on the pipeline is it possible to go from palette → RGBA? Should I care about this, or is this step better left accelerated by SDL? Does it accelerate pixel format conversion? Does it use special CPU instructions for that?

While trying to go from theory to practice I've come upon the Renderer concept. It is clearly a part of the pipeline, but I can't completely get it. What is the role of the Renderer in SDL2?

Jonas Kulla wrote:

The faster route would be to drop SDL2's render API altogether and just use it to set up your window / handle events, then use OpenGL for the actual drawing (where you can use shader programs that are executed directly on the GPU).

Do you mean that I can provide these “pre-rendered” texture vertices from the start? Like
if I can set pixels in any format, I can move from
[surface] -> [texture] -> [vertices] -> [screen] to just [vertices] -> [screen]?

I am not sure that I want to go the OpenGL way right now. I've tried to understand how it works with the Python pyglet library, but the complexity appeared too high to chew on. I just need to put some 2D stuff on the screen. If FPS is high, it is a big bonus, but not the goal. The goal is improved user experience for developers and portability - see below.

Jonas Kulla wrote:

Maybe you should try to explain why you need this information / what you're trying to do under SDL2, and then we can help you accomplish that?

I want to program a demo (effect) in Python. Preferably in pure Python, but operating systems do not provide a convenient portable interface for audio/video, so I am exploring SDL2. This is an experiment to evaluate the current progress of doing dynamic 2D graphics in scripting languages. If performance and the overall user experience are good, I may find the motivation to explore an ideal 2D API that I will eventually remember and start using.

This ideal API is called "Canvas 2D Library". The problem here is that there are many drawing libraries out there - Skia, Processing, … with incompatible APIs, and as a result there are many incompatible drawing scripts. An ideal API is impossible, but an ideal "Canvas 2D Library" that just ships all these APIs together is still realistic (given that I find a way not to get buried in this initiative alone).

When the ideal set of Canvas 2D APIs is ready, some of its parts can then be ported to other platforms - such as the web, embedded systems, exotic hardware etc. This is the faraway goal.

Thanks again. Your input is valuable for newbies like me. I think that I now understand SDL2 much better - and that's without digging into C implementation details, which is very awesome.

@Andreas: Regarding SDL_TEXTUREACCESS_STREAMING - is that "returned memory" just a pointer, or do the memory contents travel between system and video memory? While I do need the previous pixel data, it is already in my memory buffer, so won't reading pixels back just to overwrite them affect performance?

@Jonny D: SDL_gpu looks interesting, but while I don't completely understand the role of SDL_Renderer, it is hard to understand the additional features of SDL_gpu as well. Basically, because I fail to completely see the Renderer pipeline, I cannot compare how SDL_gpu fits in and whether it is worse or better. In particular I fear that it won't be able to start on a crippled T400 with integrated Intel graphics. I don't know what it supports, but WebGL in Chrome doesn't work.

SDL is meant to expose cross-platform multimedia functionality. After you create a window with SDL, the renderer system presents you with an interface that draws images and manipulates pixels on that window. Renderers are an abstraction of the various ways you can get pixels on the screen, such as OpenGL, Direct3D, OpenGL ES, or DirectFB (several of which are hardware-accelerated), so you don't have to learn all of those APIs. Check out the src/render directory in the SDL source for the details.
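
As a small illustration of that abstraction, you can create a renderer and then ask SDL which backend it actually picked. A minimal sketch in C, assuming a window already exists:

    /* Assumes an existing SDL_Window *win. */
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    SDL_RendererInfo info;
    if (SDL_GetRendererInfo(ren, &info) == 0) {
        /* Prints e.g. "opengl", "direct3d" or "software". */
        SDL_Log("using render backend: %s", info.name);
    }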

I don’t know how “crippled” your T400 is, but SDL_gpu is meant to build and
run on some old hardware with its OpenGL 1.1 renderer. SDL’s own renderers
(in particular the software renderer) will run on practically everything.

If you’re trying to get a feel for how SDL’s rendering system works, you
should use it. Look at the demos in the source repository or look up some
tutorials. That’s certainly the quickest way.

Jonny D


It is not possible to read VRAM via SDL. The returned memory is a pointer to system RAM which is privately allocated by SDL.



Pallav Nawani
Game Designer/CEO, Ironcode Gaming


Jonas Kulla wrote:

Hmm. The final pipeline (when doing hardware-accelerated drawing) is that of the backing implementation that you end up using, be that OpenGL, Direct3D, etc.

Thanks. That makes it clearer. So SDL2 supports different pipelines and, if I get it right, it selects the one that should be used automatically. Is there a way to select the pipeline myself?

Yes.

I mean, there should be for sure, but can I get a list of those and their components (steps) from somewhere?

You can get a list of the available backends at runtime from SDL2,
look at the wiki (I have some things to do, so I’m not going to gather
up links this time). As a general rule the software backend SHOULD
always be available; and either DirectX, 1+ variants of OpenGL, or a
mixture of those two, will ALMOST always be available. You need to get
the list before allocating the renderer in order to actually benefit
from it, but once you have the info you can use it to indicate the
correct backend.

I doubt that I can query SDL2 itself (the ideal way would be a Python module with a command line interface), but perhaps some docs…

Certainly you can’t just get it to spit out a list of the backends on
your machine since it’s a dynamic library, but I’d be surprised if
there wasn’t a tool stuffed somewhere to list your machine’s backends.
Even if one of those doesn’t exist either, it should still be fairly
simple: just print a copy of the list into a file.
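
A minimal sketch of that enumeration with the C API; SDL_HINT_RENDER_DRIVER (or the index argument of SDL_CreateRenderer) is how you would then pick a backend yourself:

    #include <SDL.h>

    int main(int argc, char *argv[]) {
        SDL_Init(SDL_INIT_VIDEO);

        /* List every render backend this SDL build knows about. */
        int n = SDL_GetNumRenderDrivers();
        for (int i = 0; i < n; ++i) {
            SDL_RendererInfo info;
            if (SDL_GetRenderDriverInfo(i, &info) == 0)
                SDL_Log("driver %d: %s", i, info.name);
        }

        /* Optionally ask for a specific backend before creating a renderer. */
        SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");

        SDL_Quit();
        return 0;
    }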

I understand that SDL hides these details, but shouldn’t I care about
pipeline details to make
my application work fast?

SDL doesn’t really try to hide stuff too hard, it tries more to even
out platform variances so that a given piece of code can run almost
identically in multiple environments. This can accidentally result in
details being hidden, but it’s not really intentional (unless those
are details of SDL’s implementation, in which case they might change
unexpectedly).

It also helps to understand how SDL2 can be applied better
(like, can I extend
SDL with more pipelines for my imaginary embedded system).

This could be done, yes; however, you’d need to write at the very
least some glue code in C.

Jonas Kulla wrote:

Generally, you load an image from disk (e.g. with SDL2_image) and it ends up in a surface, which is just a chunk of pixels in RAM with a description of the format etc. From that you create a texture, which is an image managed by the 3D implementation (and so in 99% of cases will end up in VRAM), which is then drawn to the screen using 4 textured vertices.

If you want to apply effects via direct pixel access in RAM, you will have to go the slow route of manipulating your surface and re-uploading it to your texture.

If I want to program my own 2D graphics effect from scratch, I thought the fastest way would be to change values in the chunk of pixels in RAM directly. Why can't I write them to VRAM directly?

Because this isn’t the 80s and early 90s, so 99 times out of 100
you’ll have to contend with a GPU. That having been said, there is a
modern way to do what you’re talking about, but it requires using
“shader programs”, and SDL doesn’t provide an abstraction for those
(an abstraction would basically require that a translating compiler be
included as part of SDL2: not happening). For shaders you need to
customize for the underlying target, while realizing that sometimes
the underlying target just won’t support them at all.

Also, what is the format of this chunk of pixels?

It varies. For SDL’s surfaces there are multiple formats supported.
For video memory it’ll MOSTLY depend on the graphics card (even the
actual GPU processor might not be the deciding factor, if a
manufacturer customized their drivers and/or firmware).

I would like to use the old-school VGA palette, because the drawing algorithm uses values 0…255 for every color, not RGBA (which I think is the format for textures). Where on the pipeline is it possible to go from palette → RGBA?

This can happen at multiple places. SDL’s texture code can do the job
if I remember right, and Ryan once or twice posted some shader code to
do the same job.
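
One place the conversion can happen is on the surface side: an 8-bit SDL_Surface carries a palette, and SDL_ConvertSurfaceFormat expands the indices to RGBA before you build a texture from it. A sketch, with a made-up grayscale palette standing in for the VGA one:

    /* 8-bit indexed surface: each pixel is a 0..255 index into a palette. */
    SDL_Surface *indexed = SDL_CreateRGBSurface(0, 320, 200, 8, 0, 0, 0, 0);

    SDL_Color colors[256];
    for (int i = 0; i < 256; ++i) {          /* placeholder palette: grayscale */
        colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
        colors[i].a = 255;
    }
    SDL_SetPaletteColors(indexed->format->palette, colors, 0, 256);

    /* ...write your 0..255 values into indexed->pixels here... */

    /* Expand palette indices into 32-bit RGBA pixels. */
    SDL_Surface *rgba = SDL_ConvertSurfaceFormat(indexed, SDL_PIXELFORMAT_ARGB8888, 0);
    /* 'rgba' can now be uploaded with SDL_CreateTextureFromSurface() or
       SDL_UpdateTexture(); SDL_FreeSurface() both surfaces when done. */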

Should I care about this, or is this step better left accelerated by SDL?

Depends on how you feel like implementing it.

Does it accelerate pixel format conversion? Does it use special CPU instructions for that?

It MIGHT use some optimized code for the process, but communicating
with a graphics card is normally restrained by the speed of the
PCI/AGP/PCIe/etc. bus, instead of by the CPU speed, so this isn’t
likely to be important.

While trying to go from theory to practice I've come upon the Renderer concept. It is clearly a part of the pipeline, but I can't completely get it. What is the role of the Renderer in SDL2?

It wraps around the hairy details of the underlying accelerated graphics system, so that you don't have to deal with those details, OR with dissonances between different backends.

Jonas Kulla wrote:

The faster route would be to drop SDL2's render API altogether and just use it to set up your window / handle events, then use OpenGL for the actual drawing (where you can use shader programs that are executed directly on the GPU).

Do you mean that I can provide these "pre-rendered" texture vertices from the start? Like if I can set pixels in any format, I can move from [surface] -> [texture] -> [vertices] -> [screen] to just [vertices] -> [screen]?

I am not sure that I want to go the OpenGL way right now. I've tried to understand how it works with the Python pyglet library, but the complexity appeared too high to chew on. I just need to put some 2D stuff on the screen. If FPS is high, it is a big bonus, but not the goal. The goal is improved user experience for developers and portability - see below.

If you don’t really care about FPS then you might try starting off
with the software renderer.

Jonas Kulla wrote:

Maybe you should try to explain why you need this information / what
you’re trying to do
under SDL2, and then we can help you accomplish that?

I want to program a demo (effect) in Python.

Is this a "demoscene" demo? I would assume software mode to be what you REALLY want if so.


@Andreas: Regarding SDL_TEXTUREACCESS_STREAMING - is that "returned memory" just a pointer, or do the memory contents travel between system and video memory?

For the software renderer it's (at least in the case of normal pixel formats) a pointer to where the data already was; for accelerated renderers it's best to assume that it had to be copied out over a data bus to get into main memory, and will have to be copied back when you're done. There CAN BE exceptions, but they're likely to come with reduced performance.

While I do need the previous pixel data, it is already in my memory buffer, so won't reading pixels back just to overwrite them affect performance?

Yes. If you’re keeping the data already then shun this data copy like
the plague, as AT BEST you’ll get a trivial O( 1 ) hit every time, and
at worst you’ll get a more meaningful slowdown.

@Jonny D: SDL_gpu looks interesting, but while I don't completely understand the role of SDL_Renderer, it is hard to understand the additional features of SDL_gpu as well. Basically, because I fail to completely see the Renderer pipeline, I cannot compare how SDL_gpu fits in and whether it is worse or better.

SDL_Renderer implements a few basic operations using whatever native
interfaces that it can detect. It’s mostly useful for either very
basic uses (e.g. rendering a surface that you’ve already prepared to a
window), or as a way to start OpenGL or DirectX without having to put
any effort into it. SDL_gpu I haven’t touched.

Thanks for the replies. The thread is long, but the signal-to-noise ratio is very high.

While there are still some missing details (like the palette-to-value transformation), I managed to create this:
https://bitbucket.org/techtonik/discovery/src/tip/graphics/pysdl2/?at=default

On Windows, just check out the repository, run ./bootstrap.py to fetch dependencies and then ./demofire.py

The name HellFire was chosen for a Hello World fire effect, but in reality it turned out to mean a demo that runs slow as hell. =)

While I am digesting all the details in this thread, there is one question I
currently can’t answer.

Why can't I have two renderers for the same window? I am not sure if it is an SDL2 limitation, but it lets me create two; when I try to draw with the second, it fails with sdl2.ext.common.SDLError: 'Invalid renderer'.

Aside from the question, here is the explanation of why I need it. Experimenting produces unexpected outcomes and I save them as "scenes", which I can cycle through with the right/left arrows. When a scene is switched, I want to preserve its current picture, which, as I understand it, is kept by the renderer. I draw every scene incrementally without renderer.clear(), and I cannot just redraw it from scratch.

2014/1/13 techtonik:

Why can't I have two renderers for the same window? […] When a scene is switched, I want to preserve its current picture.

As was already mentioned, a "renderer" is nothing more than an abstraction over a native (possibly HW-accelerated) graphics library, such as OpenGL. You use it to draw things to your window. The renderer doesn't preserve any of the things it draws, only state such as the blend mode etc. that defines how things are to be drawn.

To preserve the pixels you draw across screen clears, you would have to render to a texture instead, and then draw it to the screen. See here:
http://wiki.libsdl.org/SDL_SetRenderTarget

Create a window-sized texture for every "scene" you have, render to them with the above function, then at the end set the render target to NULL (which means the actual window) and draw the scene texture to the screen.
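
A rough sketch of that per-scene render-target pattern in C (it assumes an existing SDL_Renderer *ren whose backend supports render targets, i.e. SDL_RENDERER_TARGETTEXTURE; the 640x480 size is arbitrary):

    /* Assumes an existing SDL_Renderer *ren created for the window. */
    SDL_Texture *scene = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                           SDL_TEXTUREACCESS_TARGET, 640, 480);

    /* Draw incrementally into the scene texture; its contents persist. */
    SDL_SetRenderTarget(ren, scene);
    SDL_SetRenderDrawColor(ren, 255, 0, 0, 255);
    SDL_RenderDrawLine(ren, 0, 0, 639, 479); /* example incremental drawing */

    /* Switch back to the window and show the accumulated scene. */
    SDL_SetRenderTarget(ren, NULL);
    SDL_RenderClear(ren);
    SDL_RenderCopy(ren, scene, NULL, NULL);
    SDL_RenderPresent(ren);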

Just for reference, I find this StackOverflow answer very useful for understanding the role of the Renderer:

c++ - What is an SDL renderer? - Stack Overflow

Jonas Kulla wrote:

As was already mentioned, a "renderer" is nothing more than an abstraction over a native (possibly HW-accelerated) graphics library, such as OpenGL. You use it to draw things to your window. The renderer doesn't preserve any of the things it draws, only state such as the blend mode etc. that defines how things are to be drawn.

To preserve the pixels you draw across screen clears, you would have to render to a texture instead, and then draw it to the screen. See here:
http://wiki.libsdl.org/SDL_SetRenderTarget

Create a window-sized texture for every "scene" you have, render to them with the above function, then at the end set the render target to NULL (which means the actual window) and draw the scene texture to the screen.

So that means video memory will be copied back to a memory buffer on every renderer operation, right?

It looks like an awful way of doing things. Is it possible to use the renderer to draw into a memory buffer and push it to video memory on renderer.present()?

I am not sure how drawing to a texture is better than just maintaining my own buffer in memory. I am not sure if the renderer allows drawing in memory, but if the renderer is the only way to modify the texture - what's the point in copying it from video memory every time instead of maintaining the same buffers in ordinary memory?

If you’re using a HW-accelerated renderer, then SDL_Texture is mostly just
a handle for texture memory on the GPU. When you use the rendering APIs,
all the work is done on the GPU and there is no copying of video (GPU)
memory to system (RAM) memory. That’s why this is preferred over the old
SDL_Surface way from SDL 1.2.

You can still use an SDL_Surface and then copy it to an SDL_Texture in
order to get it on the screen, but that is the operation that transfers the
pixel data between video and system memory.
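
If you do want to keep your own buffer in ordinary memory, the usual pattern is exactly that: draw into your array each frame and push it once per frame with SDL_UpdateTexture. A sketch (buffer size and pixel format are arbitrary; for frequent updates a streaming texture with SDL_LockTexture may be faster):

    /* Assumes an existing SDL_Renderer *ren. Your own pixel buffer lives in RAM: */
    enum { W = 320, H = 200 };
    static Uint32 framebuf[W * H];           /* ARGB8888, one Uint32 per pixel */

    SDL_Texture *tex = SDL_CreateTexture(ren, SDL_PIXELFORMAT_ARGB8888,
                                         SDL_TEXTUREACCESS_STREAMING, W, H);

    /* Each frame: draw into framebuf yourself, then upload RAM -> VRAM once. */
    SDL_UpdateTexture(tex, NULL, framebuf, (int)(W * sizeof(Uint32)));
    SDL_RenderClear(ren);
    SDL_RenderCopy(ren, tex, NULL, NULL);
    SDL_RenderPresent(ren);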

Jonny D
