I'm definitely a bit lost with the new C++ chrono library.
Here I have an update loop. It runs two operations:
engine.Update()
engine.Render()
Both are lengthy operations, and it's hard to predict how long each will take.
So we measure how long the last frame took, then use that to break the update into capped steps before we call render.
To do this, I'm using C++11's chrono functionality. I chose it because it sounded like a good deal: more accurate and more platform independent. I'm finding I'm hitting more problems than I expected, though.
Below is my code, along with my primary problem. Any help, either with the problem or with a more proper way to do these operations, is greatly appreciated!
I marked my questions in comments directly next to the lines in question, and I'll reiterate them below.
The header file:
#include <chrono>

class MyClass
{
private:
    typedef std::chrono::high_resolution_clock Clock;

    Clock::time_point mLastEndTime;
    std::chrono::milliseconds mDeltaTime;
};
The simplified update loop:
// Time it took last loop
milliseconds frameTime;

// The highest we'll let that time go. 60 fps = 1/60 of a second; in milliseconds, * 1000
const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f)); // It's hard to tell, but this seems to come out to some tiny number, not what I expected!

while (true)
{
    // How long did the last update take?
    frameTime = duration_cast<milliseconds>(Clock::now() - mLastEndTime); // Is this the best way to get the delta time, with a duration cast?

    // Mark the last update time
    mLastEndTime = Clock::now();

    // Don't update everything with the full frameTime; cap each step at our maximum delta.
    while (frameTime.count() > 0) // Is this the best way to measure greater than 0 milliseconds?
    {
        // Determine the minimum time: our frameTime, or the max delta time?
        mDeltaTime = std::min(frameTime, kMaxDeltatime);

        // Update our engine.
        engine->Update((long)mDeltaTime.count()); // From here, it's so much easier to deal with code in longs. Is this the best way to shove a long through my code?

        // Subtract the delta time out of the total update time
        frameTime -= mDeltaTime;
    }

    engine->Render();
}
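In case it helps, here's a stripped-down, self-contained version of the loop that I've been using to poke at the timing behavior. The engine calls are replaced with sleep_for stubs, so this is only an approximation of my real code, and the stub durations are made up:

#include <algorithm>
#include <chrono>
#include <iostream>
#include <thread>

typedef std::chrono::high_resolution_clock Clock;
typedef std::chrono::milliseconds milliseconds;

// Stand-ins for engine->Update() and engine->Render().
void Update(long /*deltaMs*/) { std::this_thread::sleep_for(milliseconds(5)); }
void Render()                 { std::this_thread::sleep_for(milliseconds(10)); }

int main()
{
    const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f));

    Clock::time_point mLastEndTime = Clock::now();
    milliseconds frameTime;
    milliseconds mDeltaTime;

    for (int frame = 0; frame < 10; ++frame) // a few frames instead of while (true)
    {
        // How long did the last loop take?
        frameTime = std::chrono::duration_cast<milliseconds>(Clock::now() - mLastEndTime);
        mLastEndTime = Clock::now();

        std::cout << "frameTime: " << frameTime.count() << " ms, "
                  << "kMaxDeltatime: " << kMaxDeltatime.count() << " ms" << std::endl;

        // Consume the frame time in steps no larger than kMaxDeltatime.
        while (frameTime.count() > 0)
        {
            mDeltaTime = std::min(frameTime, kMaxDeltatime);
            Update((long)mDeltaTime.count());
            frameTime -= mDeltaTime;
        }

        Render();
    }
    return 0;
}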
The main question is:
My mDeltaTime always comes out tiny, and the inner loop basically gets stuck in an almost-infinite loop. This is because kMaxDeltatime is super small, but if I'm targeting 60 frames per second, didn't I calculate the correct number of milliseconds?
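To sanity-check just that constant in isolation, I pulled the expression out into a tiny standalone snippet (nothing here is from the engine itself). My understanding is that it should come out to 16, since 1000 / 60 is about 16.67 and the int cast truncates:

#include <chrono>
#include <iostream>

int main()
{
    // Same expression as in my loop: 1/60th of a second, converted to milliseconds.
    const std::chrono::milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f));

    // I would expect this to print 16.
    std::cout << kMaxDeltatime.count() << std::endl;
    return 0;
}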
Here are all the questions listed from above:
const milliseconds kMaxDeltatime((int)((1.0f / 60.0f) * 1000.0f)); // It's hard to tell, but this seems to come out to some tiny number, not what I expected!
frameTime = duration_cast<milliseconds>(Clock::now() - mLastEndTime); // Is this the best way to get the delta time, with a duration cast?
while (frameTime.count() > 0) // Is this the best way to measure greater than 0 milliseconds?
engine->Update((long)mDeltaTime.count()); // From here, it's so much easier to deal with code in longs. Is this the best way to shove a long through my code?
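For the greater-than-zero check in particular, I've also seen duration comparisons written against the duration type's zero value instead of calling count(), but I don't know whether that form is actually preferred:

// Alternative form of the check; comparison operators are defined for durations.
while (frameTime > std::chrono::milliseconds::zero())
{
    // ... same body as above ...
}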
I'm sorry for the confusion, guys. I feel like an idiot with this chrono library. Most of the help sites, reference material, and even the code itself are very confusing to read, and it's hard to see how they apply to what I'm doing. Pointers on how I should be searching for solutions or code are very much welcome!
EDIT:
Joachim pointed out that std::min/max works just fine for milliseconds! Updated code to reflect change.
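For anyone else who was unsure about that, here's a tiny standalone sketch (my own test, not engine code) of what I understand that to mean; std::min just needs <algorithm> and two durations of the same type:

#include <algorithm>
#include <chrono>
#include <iostream>

int main()
{
    std::chrono::milliseconds frameTime(42);
    std::chrono::milliseconds kMaxDeltatime(16);

    // Works because both arguments are the same duration type.
    std::chrono::milliseconds step = std::min(frameTime, kMaxDeltatime);

    std::cout << step.count() << std::endl; // prints 16
    return 0;
}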