I am working with Python 3.6.1 on Jupyter 5. My goal is to test how portalocker manages concurrent appends to the same file.
To accomplish that, I wrote a simple function that appends a single line to a file, and I use multiprocessing.Pool and Pool.map() to run the function in parallel.
Here is the code in Jupyter notebook.
cell 1
from time import time
from multiprocessing import Pool
import portalocker
def f(*args):
    while time() < start + 1:
        pass
    with open('portalocker_test.txt', 'a') as f:
        portalocker.lock(f, portalocker.LOCK_EX)
        f.write(f'{time()}\n')
cell 2
start = time()
with Pool(4) as p:
    p.map(f, range(4))
cell 3
with open('portalocker_test.txt', 'r') as f:
    for line in f:
        print(line, end='')
If I run this code once I get the expected result:
Out of cell 3:
1495614277.189394
1495614277.1893928
1495614277.1893911
1495614277.1894028
But if I run cell 2 again (without restarting the notebook) I get:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-5-db9c07d32724> in <module>()
1 start = time()
2 with Pool(4) as p:
----> 3 p.map(f, range(4))
/Users/xxx/Homebrew/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/pool.py in map(self, func, iterable, chunksize)
258 in a list that is returned.
259 '''
--> 260 return self._map_async(func, iterable, mapstar, chunksize).get()
261
262 def starmap(self, func, iterable, chunksize=None):
/Users/xxx/Homebrew/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/pool.py in get(self, timeout)
606 return self._value
607 else:
--> 608 raise self._value
609
610 def _set(self, i, obj):
/Users/xxx/Homebrew/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/pool.py in _handle_tasks(taskqueue, put, outqueue, pool, cache)
383 break
384 try:
--> 385 put(task)
386 except Exception as e:
387 job, ind = task[:2]
/Users/xxx/Homebrew/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/connection.py in send(self, obj)
204 self._check_closed()
205 self._check_writable()
--> 206 self._send_bytes(_ForkingPickler.dumps(obj))
207
208 def recv_bytes(self, maxlength=None):
/Users/xxx/Homebrew/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/multiprocessing/reduction.py in dumps(cls, obj, protocol)
49 def dumps(cls, obj, protocol=None):
50 buf = io.BytesIO()
---> 51 cls(buf, protocol).dump(obj)
52 return buf.getbuffer()
53
TypeError: cannot serialize '_io.TextIOWrapper' object
The same error gets raised if I read the file before running cell 2. So, if I never open the file before running cell 2, everything works fine; if I open the file first, I get that error. This seems inconsistent to me. What is going on, and how can I solve it?
Also, using portalocker or not does not change this behavior, so portalocker is not the problem. I haven't checked this in plain Python, but I am really interested in running it under Jupyter.
The problem is that you reuse the same name, f, for two different objects: the function and the file handles. When cell 3 runs, with open(...) as f rebinds the notebook-global name f to the (now closed) file object; the next time cell 2 runs, p.map(f, range(4)) therefore tries to pickle a file object instead of a function, and file objects cannot be pickled. Avoid using the same name for different objects. In your case, it should help to change the function name from f to function (or any other name different from f):
cell 1
from time import time
from multiprocessing import Pool
import portalocker
def function(*args):
    while time() < start + 1:
        pass
    with open('portalocker_test.txt', 'a') as f:
        portalocker.lock(f, portalocker.LOCK_EX)
        f.write(f'{time()}\n')
cell 2
start = time()
with Pool(4) as p:
    p.map(function, range(4))
or renaming the file objects obtained with open from f to file (or any other name different from f):
cell 1
from time import time
from multiprocessing import Pool
import portalocker
def f(*args):
    while time() < start + 1:
        pass
    with open('portalocker_test.txt', 'a') as file:
        portalocker.lock(file, portalocker.LOCK_EX)
        file.write(f'{time()}\n')
cell 3
with open('portalocker_test.txt', 'r') as file:
    for line in file:
        print(line, end='')
or do both.
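The effect is easy to reproduce without multiprocessing at all: Pool.map pickles the object it is given to send it to the worker processes, and a file object cannot be pickled. A minimal sketch (the filename is just for illustration):

```python
import pickle

# Cell 3 rebinds the notebook-global name `f` to a file object.
# After the `with` block, `f` still refers to that (closed) file.
with open('portalocker_test.txt', 'w') as f:
    f.write('demo\n')

# Pool.map does essentially this to ship its arguments to workers,
# and pickling a _io.TextIOWrapper raises TypeError:
try:
    pickle.dumps(f)
except TypeError as e:
    print(e)
```

This is why renaming either the function or the file handle fixes the problem: the two bindings no longer clobber each other, so p.map always receives the function.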