Hi,
First post here. Nice to meet you. A problem:
I’m working on a texture loader for OpenGL in C++. Right now, a
(simplified) part where a surface is loaded looks like:
// Load the image from disk, then convert it to the display's format:
SDL_Surface* data_surf  = IMG_Load(data_path.c_str());
SDL_Surface* data_surf2 = SDL_DisplayFormat(data_surf);
data_surf->format->BytesPerPixel is 3 (it's an unsigned char), but
data_surf2->format->BytesPerPixel is 4. Shader testing confirms this:
data_surf2 really does end up with four channels. It also appears that the
(unwanted) added alpha channel's values are all set to 0.
I'm confused by this behavior of SDL_DisplayFormat(…); I thought adding an
alpha channel was what SDL_DisplayFormatAlpha(…) is for. In any case, the
extra channel messes up the texture.
I've tested with source images in .jpg, .bmp, and .png (no alpha channel)
formats, all with the same results.
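In case it helps, here's a minimal sketch of the workaround I'm considering: skip SDL_DisplayFormat entirely and convert to an explicit 24-bit RGB format with SDL_ConvertSurface. (Assumptions: SDL 1.2 and SDL_image; the "texture.png" path is just a stand-in for my data_path.)

```cpp
#include <SDL/SDL.h>
#include <SDL/SDL_image.h>
#include <cstdio>
#include <string>

int main() {
    std::string data_path = "texture.png"; // stand-in for the real path

    SDL_Surface* data_surf = IMG_Load(data_path.c_str());
    if (!data_surf) {
        std::fprintf(stderr, "IMG_Load failed: %s\n", IMG_GetError());
        return 1;
    }

    // Describe an explicit packed 24-bit RGB format (no alpha mask).
    // SDL_ConvertSurface recomputes shifts/losses from the masks, so only
    // BitsPerPixel, BytesPerPixel, and the masks need to be set here.
    SDL_PixelFormat fmt = *data_surf->format;
    fmt.BitsPerPixel  = 24;
    fmt.BytesPerPixel = 3;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    fmt.Rmask = 0xFF0000; fmt.Gmask = 0x00FF00; fmt.Bmask = 0x0000FF;
#else
    fmt.Rmask = 0x0000FF; fmt.Gmask = 0x00FF00; fmt.Bmask = 0xFF0000;
#endif
    fmt.Amask = 0;

    // Unlike SDL_DisplayFormat, this doesn't depend on the display's
    // pixel format, so BytesPerPixel should stay 3.
    SDL_Surface* data_surf2 = SDL_ConvertSurface(data_surf, &fmt, SDL_SWSURFACE);
    if (data_surf2) {
        std::printf("BytesPerPixel: %d\n",
                    (int)data_surf2->format->BytesPerPixel);
        SDL_FreeSurface(data_surf2);
    }
    SDL_FreeSurface(data_surf);
    return 0;
}
```

This at least gives me a surface whose layout I control, though I'd still like to understand why SDL_DisplayFormat adds the channel in the first place.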
Ideas?
Thanks,
Ian Mallett