How can I write data to a file in Python in a thread-safe way? I want to save some variables to a file for every request, and every hour I want to do some grouping and write the results to MySQL.
In Java I currently put the data into an array that is cached in memory and written to a file once the array is full.
How can I do this in Python? There are many concurrent requests, so it has to be thread-safe.
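For reference, the Java-style approach (buffer in memory, flush to the file once the buffer is full) would translate to Python roughly like the sketch below. This is only illustrative; the BufferedFileWriter name, the buffer size, and the use of a plain list guarded by a threading.Lock are my own choices, not an existing API:

import threading

class BufferedFileWriter:
    # illustrative sketch: buffers lines in memory, flushes when the buffer is full
    def __init__(self, path, max_items=100):
        self.path = path
        self.max_items = max_items
        self.buffer = []
        self.lock = threading.Lock()

    def add(self, line):
        with self.lock:
            self.buffer.append(line)
            if len(self.buffer) >= self.max_items:
                self._flush_locked()

    def _flush_locked(self):
        # called with the lock held; appends all buffered lines to the file
        with open(self.path, "a") as f:
            f.writelines(line + "\n" for line in self.buffer)
        self.buffer = []

    def flush(self):
        with self.lock:
            if self.buffer:
                self._flush_locked()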
EDIT:
We ended up using the logging module, which works fine.
We used the logging module:
import logging

logpath = "/tmp/log.log"
logger = logging.getLogger('log')
logger.setLevel(logging.INFO)
ch = logging.FileHandler(logpath)
ch.setFormatter(logging.Formatter('%(message)s'))
logger.addHandler(ch)

def application(env, start_response):
    # logging handlers are thread-safe, so this can be called from concurrent requests
    logger.info("%s %s", "hello", "world!")
    start_response('200 OK', [('Content-Type', 'text/html')])
    return [b"Hello!"]
I've made a simple writer that uses threading and Queue and works fine with multiple threads. Pros: in theory it can accept data from multiple threads without blocking them, and the actual writing happens asynchronously in a separate thread. Cons: the extra writer thread consumes resources, and because of the GIL, threading in CPython doesn't give true parallelism.
from queue import Queue, Empty
from threading import Thread

class SafeWriter:
    def __init__(self, *args):
        self.filewriter = open(*args)
        self.queue = Queue()
        self.finished = False
        Thread(name="SafeWriter", target=self.internal_writer).start()

    def write(self, data):
        # called by producer threads; only enqueues, never touches the file
        self.queue.put(data)

    def internal_writer(self):
        # single consumer thread: the file is only ever written from here
        while not self.finished:
            try:
                data = self.queue.get(True, 1)
            except Empty:
                continue
            self.filewriter.write(data)
            self.queue.task_done()

    def close(self):
        self.queue.join()       # wait until every queued item has been written
        self.finished = True    # let the writer thread exit its loop
        self.filewriter.close()
# use it like an ordinary open(), like this:
w = SafeWriter("filename", "w")
w.write("can be used among multiple threads")
w.close()  # it is really important to close, or the program will not end
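A quick way to exercise it from several threads at once (the worker function and the thread/line counts here are just for demonstration):

from threading import Thread

w = SafeWriter("demo.log", "w")

def worker(n):
    for i in range(100):
        w.write("thread %d line %d\n" % (n, i))

threads = [Thread(target=worker, args=(n,)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
w.close()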