
c++ - Sleeping for an exact duration

My understanding of the Sleep function is that it follows "at least" semantics, i.e., sleep(5) guarantees that the thread sleeps for at least 5 seconds, but it may remain blocked for longer depending on other factors. Is there a way to sleep for exactly a specified time period (without busy waiting)?
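For reference, this "at least" behavior is easy to observe with standard C++ (a minimal sketch using <chrono> and <thread>; the exact overshoot depends on the OS scheduler and timer resolution):

#include <chrono>
#include <iostream>
#include <thread>

int main()
{
    using clock = std::chrono::steady_clock;

    const auto requested = std::chrono::milliseconds(5);
    const auto start = clock::now();
    std::this_thread::sleep_for(requested);   // blocks for AT LEAST 'requested'
    const auto elapsed = clock::now() - start;

    std::cout << "requested: 5 ms, actual: "
              << std::chrono::duration_cast<std::chrono::microseconds>(elapsed).count()
              << " us\n";
}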


1 Answer


As others have said, you really need a real-time OS to try to achieve this; precise software timing is quite tricky.

However... although not perfect, you can get a LOT better results than "normal" simply by boosting the priority of the process that needs better timing. On Windows you can achieve this with the SetPriorityClass function. If you set the priority to the highest level (REALTIME_PRIORITY_CLASS: 0x00000100) you'll get much better timing results. Again, though, this will not be the perfect exactness you are asking for.
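As a minimal sketch (not from the original answer), boosting the current process looks like this; note that REALTIME_PRIORITY_CLASS generally requires elevated privileges, and without them Windows may quietly downgrade the request to HIGH_PRIORITY_CLASS:

#include <windows.h>
#include <iostream>

int main()
{
    // Ask the scheduler to treat this process as real-time.
    if (!SetPriorityClass(GetCurrentProcess(), REALTIME_PRIORITY_CLASS))
    {
        std::cerr << "SetPriorityClass failed: " << GetLastError() << "\n";
        return 1;
    }

    // Read back what we actually got (may be HIGH_PRIORITY_CLASS if not elevated).
    std::cout << "priority class: 0x" << std::hex
              << GetPriorityClass(GetCurrentProcess()) << "\n";

    // ... time-critical work goes here ...
    return 0;
}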

This is likely also possible on platforms other than Windows, but I've never had reason to do it, so I haven't tested it.

EDIT: As per the comment by Andy T, if your app is multi-threaded you also need to watch out for the priority assigned to the individual threads; on Windows this is controlled with the SetThreadPriority function.
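A companion sketch for the per-thread side (again an illustration, not part of the original answer); THREAD_PRIORITY_TIME_CRITICAL is the highest level available within the process's priority class:

#include <windows.h>
#include <iostream>

int main()
{
    // Raise the calling thread within the (already boosted) process.
    if (!SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL))
    {
        std::cerr << "SetThreadPriority failed: " << GetLastError() << "\n";
        return 1;
    }

    // ... run the timing-sensitive loop on this thread ...
    return 0;
}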


Some background...

A while back I used SetPriorityClass to boost the priority of an application in which I was doing real-time analysis of high-speed video and could NOT afford to miss a frame. Frames arrived at the PC at a very regular rate of 300 frames per second (fps), driven by external frame-grabber hardware, which fired a HW interrupt on every frame that I then serviced. Since timing was very important, I collected a lot of stats on the interrupt timing (using QueryPerformanceCounter) to see how bad the situation really was, and I was appalled at the resulting distributions. I don't have the stats handy, but basically, at normal priority, Windows serviced the interrupt whenever it felt like it. The histograms were very messy, with the standard deviation wider than my ~3 ms period. Frequently I would see gigantic gaps of 200 ms or more in the interrupt servicing (recall that the interrupt fired roughly every 3 ms)! That is, HW interrupts are FAR from exact: you're stuck with whatever the OS decides to do for you.

However, when I discovered the REALTIME_PRIORITY_CLASS setting and benchmarked with that priority, the results were significantly better and the service-interval distribution was extremely tight. I could run 10 minutes at 300 fps and not miss a single frame; the measured interrupt-servicing periods were almost exactly 1/300 s, with a tight distribution.

Also, try to minimize whatever else the OS is doing, to improve the odds of your timing working well in the app where it matters, e.g., no background video transcoding or disk defragmenting while you're trying to get precision timing with other code!

In summary:

  1. If you really need this, go with a real-time OS.
  2. If you can't use a real-time OS (impossible or impractical), boosting your process priority will likely improve your timing by a lot, as it did for me.
  3. HW interrupts won't do it... the OS still needs to decide to service them!
  4. Make sure you don't have a lot of other processes running that compete for OS attention.
  5. If timing is really important to you, do some testing. Although getting code to run exactly when you want it to is not very easy, measuring this deviation is quite easy. The high-performance counters in PCs (what you get with QueryPerformanceCounter) are extremely good.

Since it may be helpful (although a bit off topic), here's a small class I wrote a long time ago for using the high-performance counters on a Windows machine. It may be useful for your testing:

CHiResTimer.h

#pragma once
#include <windows.h>

// Thin wrapper around the Windows high-resolution performance counter
// (QueryPerformanceFrequency / QueryPerformanceCounter).

class CHiResTimer
{
private:
    LARGE_INTEGER frequency;
    LARGE_INTEGER startCounts;
    double ConvertCountsToSeconds(LONGLONG Counts);
public:
    CHiResTimer(); // constructor
    void ResetTimer(void);
    double GetElapsedTime_s(void);
};

CHiResTimer.cpp

#include "stdafx.h"
#include "CHiResTimer.h"

double CHiResTimer::ConvertCountsToSeconds(LONGLONG Counts)
{
    return ((double)Counts / (double)frequency.QuadPart) ;
}

CHiResTimer::CHiResTimer()
{
    QueryPerformanceFrequency(&frequency);
    QueryPerformanceCounter(&startCounts); // starts the timer right away
}

void CHiResTimer::ResetTimer()
{
    QueryPerformanceCounter(&startCounts); // reset the reference counter
}

double CHiResTimer::GetElapsedTime_s()
{
    LARGE_INTEGER countsNow;
    QueryPerformanceCounter(&countsNow);
    return ConvertCountsToSeconds(countsNow.QuadPart - startCounts.QuadPart);
}
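For instance, here is a quick (hypothetical) harness using the class to measure how far Sleep() overshoots a 5 ms request:

main.cpp

#include <iostream>
#include <windows.h>
#include "CHiResTimer.h"

int main()
{
    CHiResTimer timer;
    for (int i = 0; i < 10; ++i)
    {
        timer.ResetTimer();
        Sleep(5); // request 5 ms; the actual block is "at least" 5 ms
        std::cout << timer.GetElapsedTime_s() * 1000.0 << " ms elapsed\n";
    }
    return 0;
}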
