
Handling an update loop using C++ Chrono?

I'm definitely a bit lost with the new C++ chrono library.

Here I have an update loop. It runs two operations:

engine.Update()
engine.Render()

These are long-running operations, and it's hard to predict how long they will take.

So we measure how long they took, then do some calculations to figure out how many update steps to run before we call render.

To do this, I'm using C++11's chrono functionality. I chose it because it sounded like a good deal: more accurate and more platform independent. I'm finding I'm hitting more problems than I expected now, though.

Below is my code, along with my primary problem. Any help with either the problem or with a better way to structure these operations would be greatly appreciated!

I marked my questions in comments directly next to the lines in question, which I'll reiterate below.

The header file:

#include &lt;chrono&gt;

class MyClass
{
private:
    typedef std::chrono::high_resolution_clock Clock;
    Clock::time_point mLastEndTime;
    std::chrono::milliseconds mDeltaTime;
};

The simplified update loop

// time it took last loop
milliseconds frameTime;
// The highest we'll let that time go. 60 fps = 1/60, and in milliseconds, * 1000
const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f)); // It's hard to tell, but this seems to come out to some tiny number, not what I expected!
while (true)
{
    // How long did the last update take?
    frameTime = duration_cast<milliseconds>(Clock::now() - mLastEndTime); // Is this the best way to get the delta time, with a duration cast?
    // Mark the last update time
    mLastEndTime = Clock::now();

    // Don't update everything with the frameTime, keep it below our maximum fps.
    while (frameTime.count() > 0) // Is this the best way to measure greater than 0 milliseconds?
    {
        // Determine the minimum time. Our frametime, or the max delta time?
        mDeltaTime = min(frameTime, kMaxDeltatime);

        // Update our engine.
        engine->Update((long)mDeltaTime.count()); // From here, it's so much easier to deal with code in longs. Is this the best way to shove a long through my code?

        // Subtract the delta time out of the total update time 
        frameTime -= mDeltaTime;
    }
    engine->Render();
}

The main question is: my mDeltaTime always comes out tiny, and the inner loop basically spins almost forever. This is because kMaxDeltatime is super small, but if I'm targeting 60 frames per second, didn't I calculate the correct number of milliseconds?

Here are all the questions listed from above:

const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f)); // It's hard to tell, but this seems to come out to some tiny number, not what I expected!

frameTime = duration_cast<milliseconds>(Clock::now() - mLastEndTime); // Is this the best way to get the delta time, with a duration cast?

while (frameTime.count() > 0) // Is this the best way to measure greater than 0 milliseconds?

engine->Update((long)mDeltaTime.count()); // From here, it's so much easier to deal with code in longs. Is this the best way to shove a long through my code?

I'm sorry for the confusion, guys. I feel like an idiot with this chrono library. Most of the help sites, reference material, and even the code itself are very confusing to read and to relate to what I'm trying to do. Pointers on how I should be searching for solutions or example code are very much welcome!

EDIT: Joachim pointed out that std::min/max works just fine with milliseconds! Updated the code to reflect the change.

asked Feb 09 '13 by MintyAnt

1 Answer

When using std::chrono you should avoid, as much as possible, casting durations or converting durations to raw integral values. Instead you should stick with the natural durations and take advantage of the type safety that duration types provide.
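
As a quick illustration of that type safety (my sketch, not part of the original answer, assuming the usual &lt;chrono&gt; include and using namespace std::chrono): conversions that cannot lose precision compile implicitly, while lossy ones have to be spelled out with duration_cast.

#include &lt;chrono&gt;
using namespace std::chrono;

void duration_type_safety()
{
    milliseconds ms(3);
    microseconds us = ms;            // compiles: converting to a finer unit cannot lose information
    // milliseconds bad = us;        // would not compile: could silently truncate
    milliseconds ok = duration_cast&lt;milliseconds&gt;(us); // lossy conversions must be written explicitly
    (void)ok;
}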

Below is a series of specific recommendations. For each recommendation I'll quote lines of your original code and then show how I would re-write those lines.


const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f)); // It's hard to tell, but this seems to come out to some tiny number, not what I expected!

There's no reason to do this sort of computation with manual conversion constants. Instead you can do:

typedef duration<long,std::ratio<1,60>> sixtieths_of_a_sec;
constexpr auto kMaxDeltatime = sixtieths_of_a_sec{1};
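
As a quick sanity check of my own (assuming &lt;cassert&gt; and the typedef above are in scope): one tick of sixtieths_of_a_sec is exactly 1/60 of a second, whereas the original int cast truncates that down to 16 whole milliseconds.

assert(duration_cast&lt;milliseconds&gt;(sixtieths_of_a_sec{1}).count() == 16); // the cast truncates to 16
assert(sixtieths_of_a_sec{1} &gt; milliseconds(16));                         // the exact tick is a bit longer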

frameTime = duration_cast<milliseconds>(Clock::now() - mLastEndTime); // Is this the best way to get the delta time, with a duration cast?

You can just keep the value in its native type:

auto newEndTime = Clock::now();
auto frameTime = newEndTime - mLastEndTime;
mLastEndTime = newEndTime;
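
A small aside, not from the original answer: with this version frameTime is deduced as Clock::duration (typically a nanosecond-resolution integral duration), so nothing is rounded away between frames. You can confirm the deduction with a check like this, assuming &lt;type_traits&gt; is available:

static_assert(std::is_same&lt;decltype(frameTime), Clock::duration&gt;::value,
              "frameTime keeps the clock's native resolution");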

while (frameTime.count() > 0) // Is this the best way to measure greater than 0 milliseconds?

Instead use:

while (frameTime > milliseconds(0))
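
This mixed comparison is safe because both operands are first converted to their common type. If you would rather not name any particular unit, an equivalent variation (my wording, not the answer's) is:

while (frameTime &gt; decltype(frameTime)::zero())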

engine->Update((long)mDeltaTime.count()); // From here, it's so much easier to deal with code in longs. Is this the best way to shove a long through my code?

It's best to write code that uses chrono::duration types throughout, rather than to use generic integral types at all, but if you really need to get a generic integral type (for example if you must pass a long to a third-party API) then you can do something like:

auto mDeltaTime = ... // some duration type

long milliseconds = std::chrono::duration_cast&lt;std::chrono::duration&lt;long,std::milli&gt;&gt;(mDeltaTime).count();
third_party_api(milliseconds);

Or:

auto ms = mDeltaTime / milliseconds(1); // renamed so the variable does not shadow the milliseconds type
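
Note (my addition): dividing one duration by another yields a plain arithmetic value rather than a duration, so ms above is simply the number of whole milliseconds (truncated) and can be handed to the same placeholder API as before:

third_party_api(ms);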

And to get the delta you should do something like:

typedef std::common_type<decltype(frameTime),decltype(kMaxDeltatime)>::type common_duration;
auto mDeltaTime = std::min<common_duration>(frameTime, kMaxDeltatime); 
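
Putting the pieces together, here is a minimal sketch (my assembly, not verbatim from the answer) of how the whole loop might look, assuming Engine::Update still takes a long count of milliseconds as in the question; the Engine stub and RunLoop name are placeholders.

#include &lt;algorithm&gt;
#include &lt;chrono&gt;
#include &lt;ratio&gt;
#include &lt;type_traits&gt;
using namespace std::chrono;

struct Engine { void Update(long dtMs) {} void Render() {} }; // stand-in for the asker's engine

typedef high_resolution_clock Clock;
typedef duration&lt;long, std::ratio&lt;1, 60&gt;&gt; sixtieths_of_a_sec;
// Accumulate in the common type of the two durations so every conversion stays exact.
typedef std::common_type&lt;Clock::duration, sixtieths_of_a_sec&gt;::type common_duration;

void RunLoop(Engine* engine)
{
    const auto kMaxDeltatime = sixtieths_of_a_sec{1};
    auto mLastEndTime = Clock::now();

    while (true)
    {
        auto newEndTime = Clock::now();
        common_duration frameTime = newEndTime - mLastEndTime; // exact widening, no cast needed
        mLastEndTime = newEndTime;

        while (frameTime &gt; milliseconds(0))
        {
            auto mDeltaTime = std::min&lt;common_duration&gt;(frameTime, kMaxDeltatime);
            // Pass whole milliseconds as a plain integer only at the API boundary.
            engine-&gt;Update(static_cast&lt;long&gt;(mDeltaTime / milliseconds(1)));
            frameTime -= mDeltaTime;
        }
        engine-&gt;Render();
    }
}
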
answered Sep 28 '22 by bames53