If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?
I cannot simply import script A into script B, as that would create a new instance of script A rather than access any variables in the already-running script A.
The only way I can think of is having script A write to a file and script B read the data from that file. This is less than ideal, however, as something bad might happen if script B tries to read the file while script A is still writing to it. I am also looking for much faster communication between the two programs.
EDIT: Here are the examples as requested. I am aware of why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it is not going to help here. In summary, script A runs Tkinter and gathers data, while script B is views.py as part of Django, but I'm hoping this can be achieved in plain Python.
Script A
import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print(i)
        time.sleep(.01)
Script B
import time
from scriptA import return_data

if __name__ == '__main__':
    while True:
        print(return_data())  # from script A
        time.sleep(1)
You can create a fourth Python file, d.py, in the same folder as the other three Python files; it imports the other three files and runs their functions, as shown in the sketch below.
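For illustration only, d.py might look roughly like this; the module names a.py, b.py, c.py and their main() functions are assumptions, not something given in the question.

# d.py -- a minimal sketch; a, b and c are hypothetical modules
# that each expose a main() entry-point function.
import a
import b
import c

if __name__ == "__main__":
    a.main()   # run each imported script's entry point in turn
    b.main()
    c.main()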
Use the execfile() method to run a Python script from another Python script. The execfile() function executes the desired file in the interpreter. It only exists in Python 2; in Python 3 it was removed, but the same thing can be achieved with exec().
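A minimal sketch of the Python 3 replacement, assuming the target file is called scriptA.py (the file name is just an illustrative placeholder):

# Run scriptA.py inside the current interpreter, as execfile() did in Python 2.
script_globals = {"__name__": "__main__"}
with open("scriptA.py") as f:
    exec(f.read(), script_globals)   # executes the file's code in script_globals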
You can use the multiprocessing module to implement a Pipe between the two modules. Then you can start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects like dicts and lists through them.
Ex: mp2.py:
from multiprocessing import Process, Pipe
from mp1 import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())  # prints "Hello"
    p.join()
mp1.py:
from multiprocessing import Process, Pipe

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)   # send any picklable object through the pipe
    child_conn.close()
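Applied to the question's setup, the same pattern might look roughly like the sketch below: the data-gathering loop from Script A runs in a child process and periodically sends its counter through the pipe, while the parent (standing in for Script B) receives it. The names gather_data, conn and the bounded loop are illustrative assumptions, not taken from the original scripts.

# Sketch only: a Script A-style worker sending live data through a Pipe.
import time
from multiprocessing import Process, Pipe

def gather_data(conn):
    i = 0
    while i < 5:              # bounded loop for demonstration
        i += 1
        conn.send(i)          # dicts, lists, etc. can be sent the same way
        time.sleep(0.01)
    conn.close()

if __name__ == "__main__":
    parent_conn, child_conn = Pipe()
    p = Process(target=gather_data, args=(child_conn,))
    p.start()
    for _ in range(5):
        print(parent_conn.recv())   # consume the values as Script B would
    p.join()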