I am trying to make multiprocessing and socket programming work together, but I am stuck at this point. The problem is that I am getting this error:
File "multiprocesssockserv.py", line 11, in worker
clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
error: [Errno 9] Bad file descriptor
The complete code that causes the error is as follows:
import multiprocessing as mp
import logging
import socket

logger = mp.log_to_stderr(logging.WARN)

def worker(queue):
    while True:
        clientfileno = queue.get()
        print clientfileno
        clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
        clientsocket.recv(1024)  # recv() requires a buffer size argument
        clientsocket.send("Hello World")
        clientsocket.close()
if __name__ == '__main__':
    num_workers = 5
    socket_queue = mp.Queue()
    workers = [mp.Process(target=worker, args=(socket_queue,))
               for i in range(num_workers)]
    for p in workers:
        p.daemon = True
        p.start()

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)
    while True:
        client, address = serversocket.accept()
        socket_queue.put(client.fileno())
Edit: I am using socket.fromfd because I can't put sockets into a queue :) I need a way to access the same sockets from different processes somehow. That is the core of my problem.
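As a side note, on Python 3 this workaround is often unnecessary: multiprocessing's reduction machinery knows how to pickle socket objects on platforms with file-descriptor passing (e.g. Linux/macOS), so the socket itself can go on the queue. A minimal sketch of that (assuming Python 3, and using a socketpair as a stand-in for an accepted client connection):

```python
import multiprocessing as mp
import socket

def worker(queue):
    # The socket arrives in the child as a working duplicate of the
    # original descriptor.
    sock = queue.get()
    sock.sendall(b"Hello World")
    sock.close()

if __name__ == '__main__':
    parent, child = socket.socketpair()  # stand-in for an accept()ed client
    queue = mp.Queue()
    p = mp.Process(target=worker, args=(queue,))
    p.start()
    queue.put(child)              # the socket object itself, not fileno()
    print(parent.recv(1024))      # b'Hello World'
    p.join()
    child.close()
    parent.close()
```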
After working on this for a while, I decided to approach the problem from a different angle, and the following method seems to be working for me:
import multiprocessing as mp
import logging
import socket
import time

logger = mp.log_to_stderr(logging.DEBUG)

def worker(serversocket):  # renamed so it doesn't shadow the socket module
    while True:
        client, address = serversocket.accept()
        logger.debug("{u} connected".format(u=address))
        client.send("OK")
        client.close()
if __name__ == '__main__':
    num_workers = 5

    serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    serversocket.bind(('', 9090))
    serversocket.listen(5)

    workers = [mp.Process(target=worker, args=(serversocket,))
               for i in range(num_workers)]
    for p in workers:
        p.daemon = True
        p.start()

    while True:
        try:
            time.sleep(10)
        except:
            break
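For what it's worth, this is the classic preforking pattern: every process blocks in accept() on the same listening socket, and the kernel hands each incoming connection to exactly one of them. A self-contained sketch of that behaviour (Python 3, with an OS-chosen ephemeral port instead of the 9090 used above; each worker handles a single connection so the demo can exit):

```python
import multiprocessing as mp
import socket

def worker(server):
    # Accept a single connection and reply; the answer above loops forever.
    conn, address = server.accept()
    conn.sendall(b"OK")
    conn.close()

if __name__ == '__main__':
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(('127.0.0.1', 0))   # port 0: let the OS pick a free port
    server.listen(5)
    port = server.getsockname()[1]

    workers = [mp.Process(target=worker, args=(server,)) for _ in range(3)]
    for p in workers:
        p.start()

    # Three connections, each served by one of the three workers.
    for _ in range(3):
        c = socket.create_connection(('127.0.0.1', port))
        print(c.recv(1024))          # b'OK'
        c.close()

    for p in workers:
        p.join()
    server.close()
```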
I'm not an expert, so I can't give the real explanation, but if you want to use queues, you need to reduce the handle and then recreate it.
In your main process:
client, address = serversocket.accept()
client_handle = multiprocessing.reduction.reduce_handle(client.fileno())
socket_queue.put(client_handle)
And in your worker:
client_handle = queue.get()
file_descriptor = multiprocessing.reduction.rebuild_handle(client_handle)
clientsocket = socket.fromfd(file_descriptor, socket.AF_INET, socket.SOCK_STREAM)
You will also need:
import multiprocessing.reduction
That will work with your original code. However, I am currently having problems with closing sockets in worker processes after they were created as I described.
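The closing problem is most likely because socket.fromfd() calls dup() under the hood, so after rebuilding, two descriptors refer to the same connection in the worker (and the parent's original client object also needs a close()). The peer only sees EOF once every copy is closed. A small runnable illustration of the dup() behaviour (Python 3 on Unix, with a socketpair standing in for the accepted connection):

```python
import errno
import socket

a, b = socket.socketpair()
# fromfd() duplicates the descriptor: b and dup_sock now share one endpoint.
dup_sock = socket.fromfd(b.fileno(), socket.AF_UNIX, socket.SOCK_STREAM)

a.setblocking(False)
b.close()                       # closing one copy is not enough...
try:
    a.recv(1)
except OSError as e:
    # EAGAIN/EWOULDBLOCK: the connection is still open, no EOF yet.
    print(e.errno in (errno.EAGAIN, errno.EWOULDBLOCK))   # True

dup_sock.close()                # ...only now does the peer see EOF
print(a.recv(1) == b"")         # True
a.close()
```

So in the worker you would close both the rebuilt socket and its raw descriptor, and in the main process close the original client after queueing the handle.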