Now that this 16-bit bug has been fixed, I was able to do a benchmark on my
game in 32 vs. 16 bit (K6-III, TNT, W2K).
The results are such that I simply don’t believe them:
normal scene, with some colourkey blits and one big blit:
32 bit: 22 ms
16 bit: 6 ms
Just blitting 200*100 frames on a surface:
32 bit: 29 ms per 100 frames
16 bit: 2 ms per 100 frames
Either there’s a bug in the SDL_GetTicks() function, or Nvidia writes really
crappy drivers for W2K, or you did really good optimization work on the
16-bit blits.
If it’s the latter, I’d request the same for the 32-bit ones.
Anyway, this rocks…
Wish you a nice week,
Pius II.
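For reference, a minimal sketch of how a blit benchmark like this can be timed
with SDL_GetTicks() (SDL 1.2-style API; the video mode, surface setup and
200x100 test frame below are only illustrative, not the game's actual code):

/* Time 100 blits of a 200x100 surface with SDL_GetTicks(). */
#include "SDL.h"
#include <stdio.h>

int main(int argc, char *argv[])
{
    SDL_Surface *screen, *frame;
    Uint32 start, elapsed;
    int i;

    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }
    /* bpp = 0 means "use the depth the desktop is already running at" */
    screen = SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);
    if (screen == NULL) {
        fprintf(stderr, "SDL_SetVideoMode: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* A 200x100 test frame in the same pixel format as the screen */
    frame = SDL_CreateRGBSurface(SDL_SWSURFACE, 200, 100,
                                 screen->format->BitsPerPixel,
                                 screen->format->Rmask,
                                 screen->format->Gmask,
                                 screen->format->Bmask, 0);
    SDL_FillRect(frame, NULL, SDL_MapRGB(frame->format, 255, 0, 0));

    start = SDL_GetTicks();
    for (i = 0; i < 100; i++) {
        SDL_BlitSurface(frame, NULL, screen, NULL);
    }
    elapsed = SDL_GetTicks() - start;
    printf("100 blits: %u ms\n", (unsigned)elapsed);

    SDL_FreeSurface(frame);
    SDL_Quit();
    return 0;
}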
How many bits is your current display set for? 16 or 32?
--
Olivier A. Dagenais - Software Architect and Developer
> How many bits is your current display set for? 16 or 32?
Windowed mode.
I use the depth that is already running, and switched the GUI display depth.
The 16-bit blit with 2 ms is not repeatable; now it’s around 8 ms, which is
extreme anyway.
Someday I’ll even look up this “anyway” and find out how to use it, and if
it’s anyway or anyways 
Also, if you are using SDL timer functions to benchmark your code, you may
want to know that they are only accurate to 10 ms, I think. I believe
that’s what the documentation said…
--
Olivier A. Dagenais - Software Architect and Developer
The SDL timer function (I’m assuming) uses the GetSystemTime command. I
have actually just gone through writing a more accurate timer for a project
I am working on (where it needed to be precise to at least the ms… and I
actually got it much more accurate than that, with a fallback to the
standard call if necessary… I could post the code if anyone cares.)
The specs on the function are actually system dependent, but I’m assuming
everyone is using a decent computer, so the answer is actually somewhere
between 10 and 11 ms. My function was actually accurate to something like
10^-5 or -6… perhaps even more (I didn’t spec it too much, as it was WAY
more than we needed). It was built for compatibility with the VCL, but the
calls I make are actually Win API calls. The only downfall was that mine took
almost twice as long as the other one… 6 microseconds on a PIII-750!
SOOOO slow =)
Joe Tennies
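Joe doesn’t include the code here. A rough sketch of such a timer on Win32,
assuming it is built on QueryPerformanceCounter()/QueryPerformanceFrequency()
with a fallback to the coarser GetTickCount() when no performance counter is
available (the actual calls he used are not stated, and the function names
below are made up):

/* High-resolution Win32 timer sketch: QueryPerformanceCounter when
 * available, otherwise fall back to the ~10 ms GetTickCount(). */
#include <windows.h>

static LARGE_INTEGER hires_freq;
static int hires_available = 0;

void timer_init(void)
{
    /* Non-zero if the hardware provides a performance counter */
    hires_available = QueryPerformanceFrequency(&hires_freq);
}

/* Milliseconds since an arbitrary starting point, as a double */
double timer_now_ms(void)
{
    if (hires_available) {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (double)now.QuadPart * 1000.0 / (double)hires_freq.QuadPart;
    }
    /* Fallback path: roughly scheduler-tick resolution */
    return (double)GetTickCount();
}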
Might be a nice update to the SDL libs. 
-bill!
> Also, if you are using SDL timer functions to benchmark your code, you may
> want to know that they are only accurate to 10 ms, I think. I believe
> that’s what the documentation said…
SDL_GetTicks() is usually accurate to the millisecond; on Unix it uses
gettimeofday(), which is frequently implemented using the cycle
counters present in most modern CPUs.
The SDL timer functions (and SDL_Delay) depend on the scheduling granularity
of the host OS, which is frequently around 10 ms.
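An easy way to see the difference between the two on a given system is to time
SDL_Delay() itself with SDL_GetTicks(): on an OS with roughly 10 ms scheduling
granularity, a request for a 1 ms delay usually comes back much later. A
minimal sketch:

#include "SDL.h"
#include <stdio.h>

int main(int argc, char *argv[])
{
    Uint32 start, elapsed;
    int i;

    if (SDL_Init(SDL_INIT_TIMER) < 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }

    /* Ask for ten 1 ms delays; each one is bounded below by the
     * scheduler's granularity, so expect far more than 10 ms total. */
    start = SDL_GetTicks();
    for (i = 0; i < 10; i++) {
        SDL_Delay(1);
    }
    elapsed = SDL_GetTicks() - start;
    printf("10 x SDL_Delay(1): %u ms\n", (unsigned)elapsed);

    SDL_Quit();
    return 0;
}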