Can anybody see something obviously wrong with this code snippet, which is
meant to fade a 16-bit image according to gamma tables in rGamma, gGamma
and bGamma? It crashes (according to a printf) straight after the first
pixel access, but only when I’ve tried it with Visual C++ under Windows
2000 with the ‘Maximize Speed’ option set. gcc with -O9 and Visual C++
with optimisations disabled both seem to compile working code fine.
------------------8<--------------------------------
assert(SDL_LockSurface(fade) >= 0);
for (; y0 != y1; y0++)
{
    register Uint16 *pixptr = &((Uint16 *)fade->pixels)[y0 * fade->w + x0];
    register int col = x0;
    while (col++ < x1)
    {
        *pixptr = ((rGamma[((*pixptr & rm) >> rs) << rl] >> rl) << rs) |
                  ((gGamma[((*pixptr & gm) >> gs) << gl] >> gl) << gs) |
                  ((bGamma[((*pixptr & bm) >> bs) << bl] >> bl) << bs);
        /* rm, rs and rl are red mask, red shift and red loss
           respectively; likewise for the other colours */
        pixptr++;
    }
}
assert(SDL_LockSurface(fade) >= 0);
I didn’t even read the entire code, but I think your problem is right here.
In a release build (where NDEBUG is defined), the expression inside your
assert is not evaluated at all, so your surface is never locked!
you should do something like:
int lock = SDL_LockSurface(fade);
assert(lock >= 0);
probably not the cause of your problems (??), but you’ll definitely
want to use the image pitch (fade->pitch) instead of the width
(fade->w). also, pitch is in bytes, so you’ll want to do the casting a
little differently…
On the other hand, after testing it a little further I
realised the sound was very choppy.
this is a problem with MSVC’s optimizer mangling the code.
look back in the recent smpeg mailing list archives; there are some
samples of how/where to drop in some #pragmas that disable the global
optimizer for sections of the smpeg code.
it’d be nice to get this added to the smpeg code, but apparently that is
not going to happen without intensive debugging, investigation,
knowledgebase references, and a shrubbery.
Thanx for the tip, however, I might try to play ogg
files in Windows if that’s possible instead.
Roger’s tutorial on SDL_mixer used an .ogg file in an
example but it didn’t play it on my machine when I
tried it.
I downloaded the ogg sourcefiles but haven’t had the
time to look at it yet.
if SDL_mixer is built to use the oggvorbis libraries, it can play back
ogg files as a music stream. i have a version of SDL_mixer.dll with all
the ogg libraries built in. it can be grabbed from this zip file: http://www.pygame.org/ftp/win32-dependencies.zip
— Pete Shinners wrote:
> if SDL_mixer is built to use the oggvorbis libraries, it can play back
> ogg files as a music stream. [...]
Thanx a whole bunch!
I’ll stick to ogg files from now on, definitely =).
Another question:
Why wasn’t SDL_mixer built with ogg files in the first
place or maybe they were in the latest build?
Another question:
Why wasn’t SDL_mixer built with ogg files in the first
place or maybe they were in the latest build?
SDL_mixer can use ogg, but doesn’t require it. if ogg is available when
SDL_mixer is built, it will take advantage of it. this is actually a
nice flexible setup, since it allows you to create an SDL_mixer without
ogg support if it is not needed (or not available).
the binaries available from the SDL website don’t come with all the
optional dependencies built in. they mainly just stick with the ‘usual’
needs of developers. perhaps after ogg reaches an official 1.0 version
we’ll see it become more standard. :]
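for anyone building it themselves, a sketch of the usual autoconf flow (the exact steps and package names are assumptions; SDL_mixer’s configure script picks up ogg/vorbis automatically when the development libraries are installed):

```shell
# install the libogg/libvorbis development files first (package names
# vary by distribution), then rebuild SDL_mixer from source:
./configure     # detects the vorbis libraries itself if they are present
make
make install
```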
Hi all; seeing as you were so quick off the mark last time …
Current problem is that I have some panels in my game that consist of
fairly simple red boxes with smooth fills. Under Linux and my X server
set to a 16-bit mode, they come out looking fine. Under Windows, the
smooth shading rendered in the SDL window (or in full screen) turns into
ugly stripes. I’m using IMG_Load on both platforms to load from BMPs,
then converting with SDL_DisplayFormat. The same image looks fine in
Photoshop on the same Windows machine in the same screen mode.
The only difference I can tell is that when I ask for a 16-bit mode in
Windows I get a 5550 mode, and under Linux I get 5650, but surely that
shouldn’t make a difference?