I did goof, and my last e-mail did indeed go only to Daniel instead
of the whole list, but, long story short, I concur: this is odd
behavior for SDL_PixelFormatEnumToMasks. I would really like to hear
from someone (Sam/Ryan?) whether it’s the expected behavior or an
oversight. I’m willing to clarify the wiki on this aspect, as long as
someone can confirm one way or another.

If we don’t want to break the API, let’s introduce
SDL_PixelFormatEnumToMasks2 (for lack of a better name), which does
the byte flipping according to endianness.
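Something like this, maybe (untested sketch; the name and the
byte-swapping approach are just my reading of the proposal, this is
not an existing SDL function):

SDL_bool SDL_PixelFormatEnumToMasks2(Uint32 format, int *bpp, Uint32 *r,
                                     Uint32 *g, Uint32 *b, Uint32 *a)
{
    /* get the usual Uint32-wise masks first */
    if (!SDL_PixelFormatEnumToMasks(format, bpp, r, g, b, a))
        return SDL_FALSE;
#if SDL_BYTEORDER == SDL_LIL_ENDIAN
    /* on little endian, "byte N in memory" is the mirror image of
     * "byte N by significance", so swap the masks; only meaningful
     * for 32-bit packed formats */
    if (*bpp == 32) {
        *r = SDL_Swap32(*r);
        *g = SDL_Swap32(*g);
        *b = SDL_Swap32(*b);
        *a = SDL_Swap32(*a);
    }
#endif
    return SDL_TRUE;
}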
I think there are indeed some SDL_PIXELFORMATs that imply byte-wise
instead of Uint32-wise encoding, and are defined differently depending
on SDL_BYTEORDER, e.g. SDL_PIXELFORMAT_RGB24.
So I think we don’t need a new API, just a new
SDL_PIXELFORMAT_RGBA8888_BYTEWISE_AS_YOU_WOULD_EXPECT or something like
that.
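E.g. something like this in SDL_pixels.h (untested sketch;
SDL_PIXELFORMAT_RGBA32 is just a placeholder name for the proposed
bytewise format):

/* bytewise RGBA: first byte in memory is R, then G, then B, then A */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
#define SDL_PIXELFORMAT_RGBA32 SDL_PIXELFORMAT_RGBA8888 /* R in MSB = lowest address */
#else
#define SDL_PIXELFORMAT_RGBA32 SDL_PIXELFORMAT_ABGR8888 /* R in LSB = lowest address */
#endif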
Also: Having a pixelformat enum for bytewise RGBA would be really handy
for converting other strange (already supported) formats to RGBA for
OpenGL with SDL_ConvertPixels(), even when not using SDL_Surfaces.
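Roughly like this (untested sketch; w, h, src and src_pitch are assumed
to describe some already-loaded ARGB pixel data, and the
SDL_PIXELFORMAT_RGBA32 placeholder from the sketch above is used as the
destination format):

Uint8 *rgba = SDL_malloc(w * h * 4);
/* convert whatever packed format we have into bytewise RGBA */
if (SDL_ConvertPixels(w, h, SDL_PIXELFORMAT_ARGB8888, src, src_pitch,
                      SDL_PIXELFORMAT_RGBA32, rgba, w * 4) == 0) {
    /* rgba now holds R,G,B,A bytes in memory order, exactly what
     * glTexImage2D(..., GL_RGBA, GL_UNSIGNED_BYTE, ...) wants */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgba);
}
SDL_free(rgba);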
BTW, those APIs are all kinda strange…
SDL_PixelFormatEnumToMasks() sets your masks for an SDL_PIXELFORMAT
enum, and then SDL_CreateRGBSurface() only uses those masks to guess the
right enum again and use that internally… (if I haven’t missed something).
So I guess
SDL_CreateRGBSurface2(flags, w, h, depth, pixelFormatEnum)
with a better name would be cool? (Same for SDL_CreateRGBSurfaceFrom())
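An untested sketch of how it could just wrap the existing calls (depth
is dropped here, since the format enum already implies it):

SDL_Surface *SDL_CreateRGBSurface2(Uint32 flags, int w, int h,
                                   Uint32 pixelFormatEnum)
{
    int bpp;
    Uint32 rmask, gmask, bmask, amask;
    /* do the enum -> masks round-trip once, here, instead of in user code */
    if (!SDL_PixelFormatEnumToMasks(pixelFormatEnum, &bpp,
                                    &rmask, &gmask, &bmask, &amask)) {
        return NULL; /* unknown/unsupported format */
    }
    return SDL_CreateRGBSurface(flags, w, h, bpp, rmask, gmask, bmask, amask);
}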
Cheers,
Daniel

On 03/20/2015 06:24 PM, Driedfruit wrote:
Although the SDL_BYTEORDER hack is not drastic or hard to implement,
and is copy-pastable from the docs, it’s still a little weird that
every SDL project has to implement it.

On Fri, 20 Mar 2015 11:41:51 +0100 Daniel Gibson <@Daniel_Gibson> wrote:
I guess this was a bit long and potentially confusing, let’s try
shorter:

Having byte-wise RGBA (first a red byte, then a green one, …) pixel
data is a very common usecase, as it’s what many image-decoding libs
output and what OpenGL expects as input.

But according to https://wiki.libsdl.org/SDL_CreateRGBSurface and
https://wiki.libsdl.org/SDL_CreateTextureFromSurface
I’m supposed to do something like this to get bytewise RGBA pixel
data into an SDL_Surface:

Uint32 rmask, gmask, bmask, amask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif

SDL_Surface *surface = SDL_CreateRGBSurface(0, 640, 480, 32,
                                            rmask, gmask, bmask, amask);

There should be an easier way to obtain those masks that does not
require the SDL user to use obscure hex numbers and worry about
endianness. SDL_PixelFormatEnumToMasks() would be a good candidate to
obtain the masks, if there were a suitable SDL_PIXELFORMAT_* for
bytewise RGBA (and not just for “a whole Uint32 contains RGBA with R
in the least significant byte, whatever that may be on your
platform”).

Cheers,
Daniel

On 18.03.2015 17:29, Daniel Gibson wrote:
Let’s assume I have an array of bytes with pixel data, like
“unsigned char *data;”, e.g. from stb_image’s stbi_load().
The first byte is for red, the second for green, the third for
blue, the fourth for alpha.
I think it’s sane to call this RGBA, right?

So I wanna create an SDL_Surface* with this, e.g. to set the window
icon. I do:
Uint32 rmask, gmask, bmask, amask;
int bpp;
SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_RGBA8888,
&bpp, &rmask, &gmask, &bmask, &amask);
SDL_Surface* surf = SDL_CreateRGBSurfaceFrom((void*)data, w, h, 32,
                                             4*w, rmask, gmask, bmask, amask);

(Then I display that surface).
On a little endian machine, this looks wrong, because the rmask is
0xff000000, which masks the last (fourth) byte instead of the first
one (similar for the other masks).
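To spell that out with a single pixel:

Uint8 data[4] = { 0x11 /* R */, 0x22 /* G */, 0x33 /* B */, 0x44 /* A */ };
Uint32 pixel;
SDL_memcpy(&pixel, data, 4);
/* little endian: pixel == 0x44332211, so pixel & 0xff000000 extracts
 * 0x44, the alpha byte, instead of red.
 * big endian: pixel == 0x11223344, and the same mask extracts 0x11 (red). */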
Using SDL_PIXELFORMAT_ABGR8888 looks correct, but will (most
probably) look wrong on big endian machines…
Furthermore, using it seems just wrong when I really have RGBA data.

Anyway, I find it kinda surprising at first that the data passed to
SDL_CreateRGBSurfaceFrom() seem to be interpreted as 32bit ints (and
with 16bit or 24bit colordepth this is even stranger), especially as
it’s a void* pointer and not a Uint32* pointer. (Only at first
because something like this must be done internally, otherwise the
masks wouldn’t make sense)
And then it’s even more surprising that the masks generated by
SDL_PixelFormatEnumToMasks() with SDL_PIXELFORMAT_RGB888 and
SDL_PIXELFORMAT_RGBA8888 don’t seem to work correctly with
bytestreams - the very name “RGBA8888” (in contrast to the
nonexistent “RGBA32”) sounds like “there’s 8 bits/1 byte of red, then
1 byte of green etc”, not “we really assume a 32bit integer value”.

So is there a portable way to set the masks in the
(platform-specific) correct way for bytestreams?
I’d even assume that this is a more common usecase than
transforming an array of 32bit ints with red in the least
significant byte (instead of the smallest address).

Cheers,
Daniel