I need a timer to be triggered every 1 ms. This document reports that slot invocation may be much slower than even a virtual function call. But if we compare signals/slots with event delivery, which mechanism will be faster, more efficient, and produce less overhead: a QTimer with its timeout() signal connected to a slot, or bare QObject::startTimer() / QObject::killTimer() with QObject::timerEvent()?
Will the answer to the above question be the same for Windows and Linux?
QTimer is actually just a signal-slot wrapper around the QObject::startTimer() functionality, so it will undoubtedly have more overhead associated with it on all platforms (internally it implements QObject::timerEvent(), and its implementation of that function simply emits the timeout() signal).
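The leaner alternative can be sketched as follows: subclass QObject, start the timer yourself, and handle the event directly, with no signal emission in between. The class name `Worker` and the body of the handler are illustrative assumptions, not from the original post.

```cpp
#include <QObject>
#include <QTimerEvent>

// Sketch: handle the timer event directly instead of routing it
// through QTimer's timeout() signal.
class Worker : public QObject
{
public:
    Worker()
    {
        // Start a 1 ms timer; startTimer() returns an ID that we must
        // keep ourselves so we can stop the timer with killTimer().
        m_timerId = startTimer(1);
    }

    ~Worker() override
    {
        if (m_timerId != 0)
            killTimer(m_timerId);
    }

protected:
    void timerEvent(QTimerEvent *event) override
    {
        if (event->timerId() == m_timerId) {
            // ... periodic work, nominally every 1 ms ...
        }
    }

private:
    int m_timerId = 0;
};
```

The cost you save relative to QTimer is the signal emission and slot dispatch on every tick; the event delivery itself is the same in both cases.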
Of note, QBasicTimer is a more lightweight wrapper around the QObject::startTimer() functionality. If you use QBasicTimer, you still have to implement QObject::timerEvent(), but it manages the timer ID for you. As such, a QBasicTimer combines some of the ease of use of a QTimer with the efficiency of the QObject::startTimer() mechanism.
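A QBasicTimer version of the same idea might look like this; the class name `Poller` is an assumption for illustration. Note that delivery is still via timerEvent(), so no signal is emitted:

```cpp
#include <QBasicTimer>
#include <QObject>
#include <QTimerEvent>

// Sketch: QBasicTimer stores and manages the timer ID for us,
// but ticks are still delivered through timerEvent().
class Poller : public QObject
{
public:
    void start() { m_timer.start(1, this); }  // 1 ms interval, events go to this object
    void stop()  { m_timer.stop(); }

protected:
    void timerEvent(QTimerEvent *event) override
    {
        if (event->timerId() == m_timer.timerId()) {
            // ... periodic work ...
        }
    }

private:
    QBasicTimer m_timer;
};
```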
As a matter of fact, if you need precision, Qt does not guarantee that your timer fires exactly every 1 ms.
At least up to Qt 4.7.x, timers are checked for expiration inside the event loop, and their signals are raised from there. That is, they are not executed as OS-level events that interrupt other tasks.
So you could see a timer fire after 1.5 s if, for example, three other events in your loop each take 0.5 s to process.
I hope my memory is not failing me; I looked at Qt's timer code some months ago and can't remember now whether timer events are processed before or after other events.
Hope this helps you a bit more.
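The delay effect described above can be sketched like this (the blocking singleShot lambda is a contrived stand-in for any slow handler in your application): because the 1 ms timer is dispatched from the same event loop, it cannot fire while another handler is running.

```cpp
#include <QCoreApplication>
#include <QThread>
#include <QTimer>

// Sketch: one long-running handler in the event loop delays every
// pending timer, since timer events are dispatched from that loop.
int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    QTimer fast;
    fast.setInterval(1);  // nominally fires every 1 ms
    QObject::connect(&fast, &QTimer::timeout, [] {
        // ... 1 ms periodic work ...
    });
    fast.start();

    // If some other slot blocks the loop for 500 ms, the 1 ms timer
    // cannot fire until control returns to the event loop:
    QTimer::singleShot(0, [] { QThread::msleep(500); });

    return app.exec();
}
```

If you genuinely need hard 1 ms deadlines, the event loop is the wrong tool regardless of which timer API you pick; a dedicated thread or an OS-level timing facility is the usual answer.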