- Correct me if I’m mistaken, but in the case of SDL_OPENGLBLIT-using code,
when SDL_GL_UpdateRects() is called, no scaling is happening, right?
If that is true, then why not use GL_NEAREST filter modes in SDL_GL_Lock()
so that software renderers don’t choke?
It wasn’t meant for use with software rendering. If you want, you can
simply call SDL_GL_Lock yourself, change the filtering to whatever you
like, and then call SDL_UpdateRects.
Okay, saw the other reply to this
Is there a way to submit changes like this via CVS?
- For Win32 at least, the 16-bit OPENGLBLIT shadow surface allocation
should be commented out using #ifdef GL_UNSIGNED_SHORT_5_6_5 (or whatever
it is); you can see that it causes testgl to mess up the little face,
since the Windows OpenGL headers are 1.1 only. Some sort of run-time
OpenGL version check might be worthwhile.
The #ifdef GL_… is broken and cannot be used - e.g. Mesa defines these
tokens as enums rather than #defines, so there is nothing for #ifdef to
test. Maybe we should check for GL_VERSION being defined, but the standard
doesn’t say it has to be there… BTW, you should update your OpenGL
headers - OpenGL 1.1 is very old.
Right, I agree that the #ifdef stuff is suspect, but since it was being
used that way elsewhere, I figured that was the current method.
I also agree that OpenGL 1.1 is very old, but that happens to be the
version that the Microsoft software renderer implements for Win9x. I
don’t even think the SGI Win32 version is 1.2 (although it does support
565 textures), and that requires dynamically loading
Choose/Describe/SetPixelFormat, which SDL currently doesn’t do. Mesa is
for all intents and purposes 1.2, but finding a Win32 version with the MMX
optimizations (or building it) is a nontrivial undertaking.
- SDL_FillRect() is pretty useless as-is with OPENGLBLIT, since it draws
pixels which will be transparent upon GL blit. I have worked around this.
That won’t happen in 16 bit mode
Um, correct me if I’m wrong, but I think it will unless the texture
surface is 16-bit, which won’t happen on Win32 since the software renderer
is version 1.1 and doesn’t support 565 textures blah blah blah
Sorry to keep whining about Winblows, but since it is the majority platform
right now, it seems prudent to keep the Win32 stuff working.
Anyway, the workaround is to replace

    SDL_FillRect(surface, &rect, SDL_MapRGB(surface->format, r, g, b));

with

    SDL_FillRect(surface, &rect,
                 SDL_MapRGB(surface->format, r, g, b) | surface->format->Amask);

which, if proper, could be added to the SDL_MapRGB code.
I’ve never really been sure what the destination alpha blending behavior is
supposed to be for SDL.
- SDL_BlitSurface() also seems to have some trouble with the final alpha
values; this is probably due to hardware (or assembler) acceleration. I
have worked around this, but if it’s a problem in a software blitter, it
would be drastically better to change the blitter.
I guess it is a blitter problem. Usually you want to use the 16 bit
surface and won’t hit those problems.
Unless you’re using the SDL surface as a way to do HUD-like display, and
therefore you actually want to be able to specify transparent
regions. Which would be rather desirable.
FWIW, my current workaround is to force the alpha values of the affected
pixels to opaque, which is arguably bad behavior, and also very
sub-optimal. Maybe add blitters with another blend mode?
– Mike

On Fri, Jul 21, 2000 at 04:48:55PM -0400, Michael Chen wrote: