Setting FPS to 60?

Hello everyone!

I’m translating a piece of code into C#. However, it has some variables
and functions whose declarations I can’t find anywhere, so if anyone could
analyze my framerate function I would appreciate it (translated back to
C++ and regular SDL, because I know most of you guys don’t speak C#):

void waitVblankStart()
{
    float t = ((float) SDL_GetTicks()) / ((float) 60);
    if (t - lastTime < 1.0/60.0)
    {
        SDL_Delay((int)(1000000000.0 * (1.0 / 60.0 - (t - lastTime))));
    }
    // lastTime is a global, and 2,000,000 stands for 2.0GHz. How do I get
    // the actual processor cycles-per-second in SDL?
    lastTime = ((float) SdlDotNet.Core.Timer.TicksElapsed) / ((float) 2000000);

    printf("%f\n", lastTime); // print the new lastTime
}

Thanks in advance!

SDL_gfx has a framerate manager that does stuff like that …
http://www.ferzkopp.net/joomla/content/view/19/14/

—snip—
The framerate functions are used to insert delays into the graphics loop
to maintain a constant framerate.

The implementation is more sophisticated than the usual
SDL_Delay(1000/FPS);
call since these functions keep track of the desired game time per frame
for a linearly interpolated sequence of future timing points of each
frame. This is done to avoid rounding errors from the inherent
instability in the delay generation and application - i.e. the 100th
frame of a game running at 50Hz will be accurately 2.00sec after the 1st
frame (if the machine can keep up with the drawing). See also the
diagram for more details on this.

[[[ Interface ]]]

The functions return 0 or a value for success and -1 for error. All functions
use a pointer to a framerate-manager variable to operate.

void SDL_initFramerate(FPSmanager * manager);
Initialize the framerate manager, set default framerate of 30Hz and
reset delay interpolation.

int SDL_setFramerate(FPSmanager * manager, int rate);
Set a new framerate for the manager and reset delay interpolation.

int SDL_getFramerate(FPSmanager * manager);
Get the currently set framerate of the manager.

void SDL_framerateDelay(FPSmanager * manager);
Generate a delay to accommodate currently set framerate. Call once in the
graphics/rendering loop. If the computer cannot keep up with the rate (i.e.
drawing too slow), the delay is zero and the delay interpolation is reset.
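
For reference, here is a minimal sketch (not from the SDL_gfx docs) of how
these calls might be wired into a render loop, assuming SDL 1.2 plus
SDL_gfx's SDL_framerate.h header; drawScene() is just a placeholder:

#include "SDL.h"
#include "SDL_framerate.h"   /* FPSmanager lives here in SDL_gfx */

static void drawScene(SDL_Surface *screen)
{
    /* placeholder: clear/blit/render your frame here */
    (void)screen;
}

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER);
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);

    FPSmanager fps;
    SDL_initFramerate(&fps);      /* defaults to 30 Hz */
    SDL_setFramerate(&fps, 60);   /* ask for 60 frames per second */

    int running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT) running = 0;

        drawScene(screen);
        SDL_Flip(screen);

        SDL_framerateDelay(&fps); /* sleeps just long enough to hold ~60 Hz */
    }

    SDL_Quit();
    return 0;
}

SDL_framerateDelay() does the interpolation described above, so the loop
body stays trivial.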

L-28C wrote:
[snip]

That would work; in fact, C#'s SDL has something very similar. However,
I can’t use that due to the nature of my program.

I must do it manually.

Can anyone help me with that?

Thanks!

Andreas Schiffler wrote:> SDL_gfx has a framerate manager that does stuff like that …

http://www.ferzkopp.net/joomla/content/view/19/14/

[snip]

Nevermind, I found a better approach.

Thanks though.

On 3/19/07, L-28C wrote:

[snip]

Do you mind sharing your better approach with the list, so that those of us looking for similar solutions in the future can benefit from your experimenting/research/experience, etc.?
Thanks!
-Dave O.

----- Original Message -----
From: Leo Cabrera
To: A list for developers using the SDL library. (includes SDL-announce)
Sent: Tuesday, March 20, 2007 9:48 AM
Subject: Re: [SDL] Setting FPS to 60?

Nevermind, I found a better approach.

Thanks though.

On 3/19/07, L-28C wrote:
[snip]

Another approach would be to use timers with very little code. I.e.:

Add a timer and let it increment a variable "think".

Per frame:
while (think > 0) {
    doThink();
    think--;
}

And there you have it. No need to cap your FPS, and your program will
always run framerate-independently.
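
For what it's worth, here is a minimal sketch of that counter idea with an
SDL 1.2 timer (doThink() is a placeholder, and since SDL_AddTimer callbacks
run on a separate thread, a real program should protect the counter with a
mutex or an atomic rather than a plain volatile):

#include "SDL.h"

static volatile int think = 0;          /* incremented ~60 times per second */

/* Timer callback: runs on SDL's timer thread. */
static Uint32 tick(Uint32 interval, void *param)
{
    (void)param;
    think++;
    return interval;                    /* keep the timer firing */
}

static void doThink(void)
{
    /* placeholder: one fixed step of game logic */
}

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER);
    SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);
    SDL_TimerID id = SDL_AddTimer(1000 / 60, tick, NULL);   /* ~60 Hz */

    int running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT) running = 0;

        while (think > 0) {             /* run the logic steps earned so far */
            doThink();
            think--;
        }

        /* ... draw here; drawing rate is decoupled from logic rate ... */
        SDL_Delay(1);                   /* hand unused time back to the OS */
    }

    SDL_RemoveTimer(id);
    SDL_Quit();
    return 0;
}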

I am also interested in the ‘better way’, btw. :wink:

Well, I couldn’t find a way of setting the FPS; I meant that I found a
solution to the original problem that made me think setting a
static framerate was the answer. It’s program-specific, not C++, not
SDL-related, so I will not post it, to avoid people flaming me for posting
off-topic.

But if this other solution doesn’t work in the end (very possible), and
I have to resort to setting the FPS again, and I find a solution that
works, I would gladly help the people who have helped me so many times
before.

David Olsen wrote:
Do you mind sharing your better approach with the list, so that those of
us looking for similar solutions in the future can benefit from your
experimenting/research/experience, etc.?
Thanks!
-Dave O.

[snip]

Hi,

L-28C wrote:

[snip]

This topic has been repeated ad nauseam on this list. The most
flexible way is to make it framerate-independent. Search the
archives for the solutions.

David Olsen wrote:

Do you mind sharing your better approach with the list, so that those of
us looking for similar solutions in the future can benefit from your
experimenting/research/experience, etc.?
Thanks!
-Dave O.
[snip]


Cheers,
Kein-Hong Man (esq.)
Kuala Lumpur, Malaysia

Hi Guys,
There appears to be some confusion about what you want to achieve.
From what you're saying, it sounds like you have a problem with your game being
jerky or going out of time. If this is the case, you already have the
solution.

Set a timer for the frame rate you want things to move/animate, and in that
timer do the logic/game movements/animations.
(You could have one timer for player movement/animation and another for
background animation running at different speeds.)

In your main loop just do the update screen/draw everything, so that if it's
running on a slow machine everything will move at the same speed but
the screen will update slowly, and if it's on a fast machine it will again run at
the same speed and be as smooth as silk. A word of warning: you cannot
poll keys and get events in your timer, so do them in your main loop and
set flags or something to pass them to your timer. I tried to get key events
in a timer and I think it fails because the timer is an event in itself (unless
someone knows different).

Oh, and try not to do the "while(1) think loop" thing I read in one of the earlier
posts, as it will burn all your processor time. SDL_Delay, I believe,
does not simply loop; it returns the processor time back to the OS. When I
used one at the end of my main loop, the processor time the game was
using dropped significantly.
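
To illustrate, here is a rough sketch of that layout under SDL 1.2 (the flag
names and the commented-out movePlayer()/updateAnimations() calls are
made-up placeholders; the timer callback runs outside the main loop, so it
only reads simple flags that the main loop sets):

#include "SDL.h"

static volatile int leftHeld = 0, rightHeld = 0;  /* set in main loop, read in timer */

/* Logic timer: moves/animates at a fixed rate regardless of draw speed. */
static Uint32 logicTick(Uint32 interval, void *param)
{
    (void)param;
    /* movePlayer(leftHeld, rightHeld); updateAnimations(); */
    return interval;
}

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_TIMER);
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 0, SDL_SWSURFACE);
    SDL_AddTimer(1000 / 60, logicTick, NULL);     /* 60 logic steps per second */

    int running = 1;
    while (running) {
        SDL_Event ev;
        while (SDL_PollEvent(&ev)) {              /* events only in the main loop */
            if (ev.type == SDL_QUIT) running = 0;
            if (ev.type == SDL_KEYDOWN && ev.key.keysym.sym == SDLK_LEFT)  leftHeld  = 1;
            if (ev.type == SDL_KEYUP   && ev.key.keysym.sym == SDLK_LEFT)  leftHeld  = 0;
            if (ev.type == SDL_KEYDOWN && ev.key.keysym.sym == SDLK_RIGHT) rightHeld = 1;
            if (ev.type == SDL_KEYUP   && ev.key.keysym.sym == SDLK_RIGHT) rightHeld = 0;
        }

        /* draw everything as fast as this machine allows ... */
        SDL_Flip(screen);
        /* ... then give the unused time back to the OS instead of spinning */
        SDL_Delay(10);
    }

    SDL_Quit();
    return 0;
}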

I hope this helps

Trish xx