How to share data between python processes without writing to disk

Hello, I would like to share small amounts of data (< 1K) between Python processes. The data is physical PC/104 I/O data which changes rapidly and often (24x7x365). There will be a single "server" writing the data and multiple clients reading portions of it. The system this will run on uses flash memory (a CF card) rather than a hard drive, so I'm worried about wearing out the flash memory with a file-based scheme. I'd also like to use less power (processor time) as we are 100% solar powered.

  • Is this a valid worry? We could possibly change the CF card to a SSD.
  • Does changing a value using mmap physically write the data to disk or is this a virtual file?
  • We will be running on Debian so perhaps the POSIX IPC for python module is the best solution. Has anyone used it?
  • Has anyone tried the Python Object Sharing (POSH) module? It looks promising at first glance but it is in "Alpha" and doesn't seem to be actively being developed.
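On the mmap question above: whether a change hits the disk depends on what backs the mapping. A *file-backed* mapping is eventually flushed to the file by the kernel, but an *anonymous* mapping (created with a file descriptor of `-1`) lives entirely in RAM and never touches the CF card. A minimal sketch (the `b"sensor=42"` payload is just an illustrative value):

```python
import mmap

# Anonymous mapping (fd = -1): purely in memory, never backed by a
# file, so nothing is ever written to the CF card. On Unix this
# region is shared with child processes created by fork().
buf = mmap.mmap(-1, 1024)  # 1 KiB shared region

buf.seek(0)
buf.write(b"sensor=42")   # "write" here only touches RAM

buf.seek(0)
data = buf.read(9)        # another process (post-fork) would see the same bytes
buf.close()
```

The caveat is that an anonymous mapping is only shared between a parent and the children it forks after creating it, so it fits a single-server/forked-clients layout, not unrelated processes.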

Thank You

UPDATE: We slowed down the maximum data update rate to about 10 Hz, but more typically 1 Hz. Clients will only be notified when a value changes rather than at a constant update rate. We have gone to a multiple servers/multiple clients model where each server specializes in a certain type of instrument or function. Since it turned out that most of the programming was going to be done by Java programmers, we ended up using JSON-RPC over TCP. The servers will be written in Java but I still hope to write the main client in Python and am investigating JSON-RPC implementations.
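A Python client for JSON-RPC over raw TCP can be quite small. The sketch below assumes newline-delimited framing and a hypothetical `"read"` method with a `channel` parameter; the actual framing and method names depend on what the Java servers implement:

```python
import itertools
import json
import socket

_ids = itertools.count(1)

def make_request(method, params):
    """Build a JSON-RPC 2.0 request; newline-delimited framing is assumed."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method, "params": params}
    return (json.dumps(msg) + "\n").encode()

def call(host, port, method, params):
    """One-shot JSON-RPC call over TCP (host/port are placeholders)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(make_request(method, params))
        reply = json.loads(sock.makefile().readline())
        if "error" in reply:
            raise RuntimeError(reply["error"])
        return reply["result"]

# Hypothetical usage against a Java server:
#   value = call("io-server.local", 9000, "read", {"channel": 3})
```

A long-lived connection with a receive loop would suit the change-notification model better than one-shot calls, but the message format stays the same.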

asked Jan 05 '10 by RyanN



1 Answer

An alternative to writing the data to a file in the server process might be to write directly to the client processes:

Use UNIX domain sockets (or TCP/IP sockets if the clients run on different machines) to connect each client to the server, and have the server write into those sockets. Depending on your particular processing model, choosing a client/socket may be done by the server (e.g. round-robin) or by the clients signalling that they're ready for more.
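A minimal sketch of the push model described above, assuming a hypothetical socket path and one client for brevity (a real server would `accept` in a loop and fan out to all connected clients):

```python
import os
import socket
import threading

SOCK_PATH = "/tmp/io_server.sock"  # hypothetical path
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)

def serve(updates, ready):
    """Server: accept one client and push each update as a newline-terminated line."""
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    ready.set()                      # signal that the socket is ready
    conn, _ = srv.accept()
    for line in updates:
        conn.sendall(line.encode() + b"\n")
    conn.close()
    srv.close()

def read_updates():
    """Client: connect and read updates until the server closes the connection."""
    cli = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    cli.connect(SOCK_PATH)
    with cli.makefile("rb") as f:
        return [ln.decode().rstrip("\n") for ln in f]
```

Since the socket traffic never touches the filesystem (the socket file on disk is just a rendezvous point, not a data store), this sidesteps the flash-wear concern entirely.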

answered Nov 07 '22 by digitalarbeiter