Real-time operation via Python

So I am an inexperienced Python coder, with what I have gathered might be a rather complicated need. I am a cognitive scientist and I need precise stimulus display and button press detection. I have been told that the best way to do this is by using a real-time operating system, but I have no idea how to go about this. Ideally, with each trial, the program would operate in real time, and then once the trial is over, the OS could go back to not being as meticulous. There would be around 56 trials. Might there be a way to do this from my Python script?

(Then again, all I really need to know is when a stimulus was actually displayed. The real-time method would ensure that the stimulus is displayed when I want it to be, a top-down approach. On the other hand, I could take a more bottom-up approach if it is easier to just record when the computer actually got a chance to display it.)

Jenna Zeigen asked Aug 16 '11 14:08


1 Answer

When people talk about real-time computing, what they mean is that the latency from an interrupt (most commonly set off by a timer) to the application code handling that interrupt is both small and predictable. This then means that a control process can be run repeatedly at very precise time intervals or, as in your case, that external events can be timed very precisely. The variation in latency is usually called "jitter": 1 ms maximum jitter means that an interrupt arriving repeatedly will have a response latency that varies by at most 1 ms.

"Small" and "predictable" are both relative terms and when people talk about real-time performance they might mean 1μs maximum jitter (people building inverters for power transmission care about this sort of performance, for instance) or they might mean a couple of milliseconds maximum jitter. It all depends on the requirements of the application.

At any rate, Python is not likely to be the right tool for this job, for a few reasons:

  • Python runs mostly on desktop operating systems. Desktop operating systems impose a lower limit on the maximum jitter; in the case of Windows, it is several seconds. Multiple-second events don't happen very often, every day or two, and you'd be unlucky to have one coincide with the thing you're trying to measure, but sooner or later it will happen; jitter in the several-hundred-milliseconds region happens more often, perhaps every hour, and jitter in the tens-of-milliseconds region is fairly frequent. The numbers for desktop Linux are probably similar, though you can apply different compile-time options and patch sets to the Linux kernel to improve the situation - Google PREEMPT_RT_FULL.
  • Python's stop-the-world garbage collector makes latency non-deterministic. When Python decides it needs to run the garbage collector, your program gets stopped until it finishes. You may be able to avoid this through careful memory management and careful tuning of the garbage collector's parameters (a sketch of one mitigation follows this list), but depending on which libraries you are using, you may not be able to.
  • Other features of Python's memory management make deterministic latency difficult. Most real-time systems avoid heap allocation (i.e. C's malloc or C++'s new) because the amount of time it takes is not predictable. Python neatly hides allocation from you, making latency very difficult to control. Again, using lots of those nice off-the-shelf libraries only makes the situation worse.
  • In the same vein, it is essential that real-time processes have all their memory kept in physical RAM and not paged out to swap. There is no good way of controlling this in Python, especially on Windows (on Linux you might be able to fit a call to mlockall in somewhere, as sketched below, but any new allocation will upset things).
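
To illustrate the garbage collector point: one partial mitigation is to switch automatic collection off during the timing-critical part of each trial and collect explicitly between trials. A minimal sketch, where present_stimulus and wait_for_response are hypothetical stand-ins for your experiment code:

    import gc

    def run_trial(trial):
        # Suppress automatic stop-the-world collections inside the trial.
        # Reference counting still frees most objects immediately; this
        # only disables the cyclic collector.
        gc.disable()
        try:
            present_stimulus(trial)    # hypothetical experiment function
            wait_for_response(trial)   # hypothetical experiment function
        finally:
            gc.enable()
            gc.collect()  # pay the collection cost between trials instead

This removes one source of jitter; it does nothing about the operating system's scheduler.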
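
Similarly, to illustrate the mlockall point: the Python standard library does not wrap it, but on Linux you can reach it through ctypes. A sketch, assuming Linux and sufficient privileges (RLIMIT_MEMLOCK or root):

    import ctypes
    import ctypes.util

    # Flag values from <sys/mman.h> on Linux.
    MCL_CURRENT = 1  # lock all pages currently mapped
    MCL_FUTURE = 2   # lock all pages mapped from now on

    libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
    if libc.mlockall(MCL_CURRENT | MCL_FUTURE) != 0:
        raise OSError(ctypes.get_errno(), "mlockall failed")

Note that locking pages does not make allocation itself take predictable time, which is the point of the bullet above; this is a mitigation, not a guarantee.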

I have a more basic question though. You don't say whether your button is a physical button or one on the screen. If it's one on the screen, the operating system will impose an unpredictable amount of latency between the physical mouse button press and the event arriving at your Python application. How will you account for this? Without a more accurate way of measuring it, how will you even know whether it is there?
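
If the button is, or could be, a physical one, one way to sidestep some of that userspace latency on Linux is to read it as an input device and use the kernel's own event timestamps, which are attached before the event reaches your Python process at all. A sketch, assuming Linux and the third-party evdev package; the device path is hypothetical and will differ on your machine:

    from evdev import InputDevice, ecodes

    dev = InputDevice("/dev/input/event3")  # hypothetical device node

    for event in dev.read_loop():
        # value == 1 means key/button down; timestamp() is set by the
        # kernel when the event was received, not when Python saw it.
        if event.type == ecodes.EV_KEY and event.value == 1:
            print(f"button {event.code} pressed at {event.timestamp():.6f}")

This tells you when the press reached the kernel, not when your stimulus actually hit the screen, so it only addresses half of the problem.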

Tom answered Oct 17 '22 07:10