Does CreateTextureFromSurface imply SDL_TEXTUREACCESS_STATIC

Joseph Carter wrote:

When in doubt, submit some! :wink:

Yeah, again, that doesn’t help people who’re trying to learn, only people who’ve already learned. I’m not going to explain that catch-22 again.

Oh, I get it. You’re talking to a guy who until about two weeks ago
hadn’t touched a compiler in over a decade. How much of my code do
you think even compiled the first time I attempted it? :slight_smile:

This is actually why I’ve been spending so much time with the
joystick code. I used it to re-teach myself C. Two weeks
later, I have a pretty deep understanding of this code. SDL is
neither a black box nor a magic talisman. Its workings only seem
complex (particularly in the video code) because it’s written to
handle general cases rather than specific ones.

There are really two things you need to be able to make sense of
SDL’s video code above the implementation-specific drivers. They are
the pixel format, and the whole issue of hardware vs. software
surfaces and textures. It turns out that knowing a little something
about modern video hardware and drivers greatly helps here. :slight_smile:

Let’s start with pixel format. As you know, there are red, green,
blue, and alpha channels out there. SDL_Surface is capable of
representing literally any possible combination of these things. Far
more than you’ll ever use or any modern graphics driver is actually
going to support.

Here’s the first gotcha: Endianness. Most modern CPUs are little
endian. That means if you write an int 0x12345678 and read it back
as an array of four bytes, you’ll see 0x78, 0x56, 0x34, 0x12 in
that order.
This obviously is going to matter when doing graphics because a pixel
on modern systems is typically 32 bits and so we tend to operate on
them using 32 bit ints rather than 8 bit bytes.

And you run smack into a difference between OpenGL and Direct3D right
off the bat. OpenGL is made to be platform-independent. GL_RGBA and
GL_BGRA, when operating on 32 bit pixel data, literally mean the
bytes are in that order. On Direct3D, everything is “ARGB”. But
that names the hex digits of a 32 bit value (0xAARRGGBB), not the
byte order in memory, so on a little-endian system the bytes land
as B, G, R, A. Don’t ask me to explain what this means for Direct3D
on the Xbox 360 where the CPU is PowerPC-based. No idea really. :slight_smile:
But on x86 and ARM, it works as above and the OpenGL equivalent to
Direct3D’s “ARGB” is GL_BGRA. :slight_smile:

And lo, suddenly a lot of SDL’s complexity surrounding pixel formats
is explained. :slight_smile: You’ll find that SDL 2.0 seems to have cleaned up
some of this. It now has the expectation that you’re actually going
to use something sane and even has routines to tell you if a surface
(or texture) is actually in a sane format, and which one if so. :slight_smile:

Then there are hardware and software surfaces and textures. As you
already know, video cards are 3D nowadays. And 3D drivers offload
most of the graphics processing to the graphics processor on the
video card. On modern OSes, this is done even for 2D video if
possible (Windows Vista/7/8, Mac OS X 10.3+, newer window managers
and desktop environments under *nix?).

To make this work as well as it can, you want to cache as much of
your graphics data (textures) in the video card’s memory as possible
because it can draw to the frame buffer faster than you can, usually
without taking up valuable time on the system bus to do it. You can
obviously update the contents of any texture, but if your renderer
is running a 3D backend and that backend supports a special mode for
streaming textures, that will be faster than using generic code
involving glTexSubImage2D or equivalent.

Knowing that should help you make sense of SDL’s renderer which
doesn’t seem terribly complex until you get into the backends. And
if needed, you can always look at the software backend for code you
should be able to follow.

Hope that gives you a place to start. :slight_smile:

Joseph


SDL mailing list
SDL at lists.libsdl.org
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org

Not sure if you’ve really gotten my point, buddy. Nevermind.
There are always differences in functions that don’t get documented unless the designer actually makes a point of documenting them. There’s no point in the actual coders going through the .h and documenting it, because they might get it wrong. Adding example code, that’s something a programmer can do. Actually documenting it, no, the programmers can’t do that accurately.
Anyway. There’s no wiki for SDL_image, so it can’t be documented by external programmers anyway.
M

(differentiating ‘coders’/‘programmers’ as users of the API, as opposed to the API’s authors, the ‘designers’)