Sam Lantinga wrote:
SDL_UpdateRect{s}() does not clip (this is documented). However, this
bites people often enough that it’s probably a misfeature — I’ll add the
necessary clipping. (this would break forward-compatibility but it’s probably
worth it)
Please don’t. It will slow some games down by a couple frames per second.
If you have to do this, we can add SDL_UpdateRectsClipped(), but people
should really either be doing clipping themselves or actually using the
clipped rectangle that the blit generates.
The problem I was seeing was that the blit did clip and set either w or
h to zero, but the SDL_UpdateRects blew up on the clipped rectangle. The
first time this happened it actually hung the server. I had to ssh in
from another machine and restart it. I don’t think that UpdateRects
should clip, but I also don’t think that it should have a problem with a
rectangle that FillRect has already clipped.
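A cheap guard along those lines (my sketch; rect_is_updatable is a made-up helper, not an SDL call) is to validate the rectangle that SDL_FillRect hands back before passing it to SDL_UpdateRects, and simply skip the update when the blit clipped it to nothing:

```c
/* Stand-in for SDL_Rect with plain int fields, for illustration only. */
struct rect { int x, y, w, h; };

/* Returns 1 only if the (already blit-clipped) rect still has nonzero
 * area and lies entirely on a screen_w x screen_h screen; the loop in
 * the test program below would call SDL_UpdateRects() only on a pass. */
static int rect_is_updatable(const struct rect *r, int screen_w, int screen_h)
{
    return r->w > 0 && r->h > 0 &&
           r->x >= 0 && r->y >= 0 &&
           r->x + r->w <= screen_w &&
           r->y + r->h <= screen_h;
}
```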
My original posting was just to verify whether what I was seeing was
correct behavior or a bug.
Here is the test code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#include "SDL.h"

void initOrQuit(Uint32 flags)
{
    if (-1 == SDL_Init(flags))
    {
        printf("Failed to initialize SDL error=%s\n", SDL_GetError());
        exit(1);
    }
    else
    {
        printf("Initialized SDL\n");
    }
}

void testGraphics()
{
    SDL_Surface *screen = NULL;

    initOrQuit(SDL_INIT_VIDEO);
    atexit(SDL_Quit);

    //screen = SDL_SetVideoMode(640, 480, 0, (SDL_ANYFORMAT | SDL_FULLSCREEN));
    screen = SDL_SetVideoMode(640, 480, 0, SDL_ANYFORMAT);
    if (NULL == screen)
    {
        printf("Can't set video mode\n");
        exit(1);
    }

    SDL_PixelFormat *pf = screen->format;
    SDL_Rect save;
    SDL_Rect area;
    int frames = 0;
    Uint32 start = SDL_GetTicks();
    Uint32 duration = 10 * 1000;

    area.x = 0;   /* start in the corner; the rect walks off-screen below */
    area.y = 0;
    while (SDL_GetTicks() < (start + duration))
    {
        area.w = screen->w / 20;
        area.h = screen->h / 20;
        area.x += 10;
        area.y += 10;

        Uint8 r = random() & 0xff;
        Uint8 g = random() & 0xff;
        Uint8 b = random() & 0xff;
        Uint32 color = SDL_MapRGB(pf, r, g, b);

        save = area;
        SDL_FillRect(screen, &area, color);
        if ((area.x != save.x) ||
            (area.y != save.y) ||
            (area.w != save.w) ||
            (area.h != save.h))
        {
            printf("save x=%d y=%d w=%d h=%d\n",
                   save.x, save.y, save.w, save.h);
            printf("area x=%d y=%d w=%d h=%d\n",
                   area.x, area.y, area.w, area.h);
        }
        SDL_UpdateRects(screen, 1, &area);
        frames++;
    }

    Uint32 elapsed = SDL_GetTicks() - start;
    printf("elapsed=%u frames=%d fps=%f\n",
           elapsed, frames, (((float)frames) / ((float)elapsed)) * 1000);
    //SDL_Delay(5*1000);
}

int main(int argc, char **argv)
{
    testGraphics();
    return 0;
}
Yes, I know it isn’t doing anything reasonable. I’ve been going through
the SDL API testing key pieces and comparing the observed behavior to
the documented behavior. It’s the way I learn any new API. I usually
find a lot of problems in an API this way… SDL seems to be very
robust.
Bob Pendleton
P.S.
I was very pleasantly surprised to find that I could create 10,000
timers, each with a unique time delay, and have them all work correctly.
See ya,
-Sam Lantinga, Software Engineer, Blizzard Entertainment
SDL mailing list
SDL at libsdl.org
http://www.libsdl.org/mailman/listinfo/sdl
--
+--------------------------------------+
+ Bob Pendleton is seeking contract    +
+ and consulting work. Find out more   +
+ at http://www.jump.net/~bobp         +
+--------------------------------------+