Ack. After another recompile, this stopped working, and "g" fixed it.
I believe I screwed up when testing it; I’ve had to compile and copy the
DLL to a working directory and test it, and I might not have copied it
when my binary search on these options went from "agp" to "a".
It makes far more sense, though; it’s just a standard optimizer bug, not
turning on the strange "no aliasing" optimization. The attached patch
fixes it, but it turns off "global optimizations" for all of blit_N,
which may be overkill (it may only be needed for one or two blits, or
there may be a workaround not involving disabling optimizations), or
underkill (there may be blits in other files that also have this problem;
I certainly don’t use them all). I don’t feel like comparing optimized
assembly output right now, though.
Of course, this could still just be a blitter bug, but I can’t find one.
I’ve also changed the build to generate SDLd.{dll,lib} and SDLmaind.{dll,lib},
which is much more convenient for debugging, and made it output all four to
lib/, instead of VisualC/SDL(main)/{Debug,Release}/. Any reason not to
do that (or at least the first half) upstream?

On Tue, Oct 22, 2002 at 11:54:14PM -0400, Glenn Maynard wrote:
> I'm getting corrupted blits in VC7 with optimizations on in
> BlitNtoNKeyCopyAlpha. It turns out that the "assume no aliasing"
> optimization is getting turned on somehow. I have no idea how. (If it
> was on by default, it'd cause global, catastrophic problems, so
> something somewhere has to be turning it on.)
> It can be disabled explicitly with:
> #pragma optimize("a", off)
--
Glenn Maynard
-------------- next part --------------
--- SDL_blit_N.c	2002-08-01 19:06:39.000000000 -0400
+++ /home/glenn/hm/stepmania/src/SDL-1.2.5/src/video/SDL_blit_N.c	2002-10-23 21:43:42.000000000 -0400
@@ -31,6 +31,10 @@
 #include "SDL_video.h"
 #include "SDL_blit.h"
 #include "SDL_byteorder.h"
+
+#if defined(WIN32) && _MSC_VER >= 1300
+#pragma optimize("g", off)
+#endif
 
 /* Function to check the CPU flags */
 #define MMX_CPU 0x800000