Description
Bug report
Checklist
- I am confident this is a bug in CPython, not a bug in a third-party project
- I have searched the CPython issue tracker, and am confident this bug has not been reported before
CPython versions tested on:
3.11
Operating systems tested on:
Linux
Output from running 'python -VV' on the command line:
Python 3.11.5 (main, Aug 26 2023, 00:26:34) [GCC 12.2.1 20220924]
A clear and concise description of the bug:
Using nested multiprocessing (i.e. spawning a child process inside a child process) is broken as of Python 3.11.5, leading to an attribute error:
```
Process Process-1:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/io/pyi_multiprocessing_nested_process.py", line 15, in process_function
    process.start()
  File "/usr/local/lib/python3.11/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/multiprocessing/context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/multiprocessing/context.py", line 288, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/usr/local/lib/python3.11/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/usr/local/lib/python3.11/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/usr/local/lib/python3.11/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "/usr/local/lib/python3.11/multiprocessing/synchronize.py", line 106, in __getstate__
    if self.is_fork_ctx:
       ^^^^^^^^^^^^^^^^
AttributeError: 'Lock' object has no attribute 'is_fork_ctx'
Results: [1]
```
Minimal code example below. Invoke it with the argument `fork`, `forkserver` or `spawn`. `fork` will work; `forkserver` and `spawn` will both raise the above error. All three variants work with Python 3.11.4.
```python
import sys
import multiprocessing


def nested_process_function(queue):
    print("Running nested sub-process!")
    queue.put(2)


def process_function(queue):
    print("Running sub-process!")
    queue.put(1)

    process = multiprocessing.Process(target=nested_process_function, args=(queue,))
    process.start()
    process.join()


def main(start_method):
    multiprocessing.set_start_method(start_method)

    queue = multiprocessing.Queue()

    process = multiprocessing.Process(target=process_function, args=(queue,))
    process.start()
    process.join()

    results = []
    while not queue.empty():
        results.append(queue.get())

    print(f"Results: {results}")
    assert results == [1, 2]


if __name__ == '__main__':
    if len(sys.argv) != 2:
        raise SystemExit(f"Usage: {sys.argv[0]} fork|forkserver|spawn")
    main(sys.argv[1])
```
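For convenience, a small driver along these lines can run the example under each start method in a fresh interpreter (this assumes the example above is saved as pyi_multiprocessing_nested_process.py, the filename that appears in the traceback):

```python
# Illustrative driver: run the reproducer under each start method in a
# separate interpreter so that one failing variant does not stop the others.
import subprocess
import sys

for method in ("fork", "forkserver", "spawn"):
    print(f"--- start method: {method} ---")
    subprocess.run([sys.executable, "pyi_multiprocessing_nested_process.py", method])
```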
I believe the source of this regression is 34ef75d, which adds the attribute `is_fork_ctx` to `multiprocessing.Lock()` but doesn't update the pickle methods (`__getstate__()` and `__setstate__()`), so after being serialised and deserialised, the `Lock()` object loses that attribute.
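The failure mode is generic: if `__getstate__()` reads an attribute that `__setstate__()` never restores, the object survives one pickling round trip but not a second. A minimal self-contained sketch of the same pattern, using a plain class rather than the real `SemLock` (names here are illustrative only):

```python
import pickle


class Thing:
    def __init__(self):
        self.handle = 42
        self.is_fork_ctx = False      # attribute is only ever set in __init__

    def __getstate__(self):
        if self.is_fork_ctx:          # read on every pickling...
            raise RuntimeError("fork-context object shared with a spawn context")
        return (self.handle,)         # ...but not included in the pickled state

    def __setstate__(self, state):
        (self.handle,) = state        # is_fork_ctx is never restored here


first = pickle.loads(pickle.dumps(Thing()))   # first round trip: works
second = pickle.loads(pickle.dumps(first))    # second round trip: AttributeError
```

In the reproducer, the second round trip is exactly the nested case: the outer child receives the Queue's internal `Lock` by unpickling, then has to pickle it again to start the nested child, which is where the AttributeError above is raised.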
The following patch, adding `is_fork_ctx` to the pickle methods, makes the above work again.
```diff
diff --git a/Lib/multiprocessing/synchronize.py b/Lib/multiprocessing/synchronize.py
index 2328d33212..9c5c2aada6 100644
--- a/Lib/multiprocessing/synchronize.py
+++ b/Lib/multiprocessing/synchronize.py
@@ -109,10 +109,11 @@ def __getstate__(self):
                                'not supported. Please use the same context to create '
                                'multiprocessing objects and Process.')
         h = sl.handle
-        return (h, sl.kind, sl.maxvalue, sl.name)
+        return (h, sl.kind, sl.maxvalue, sl.name, self.is_fork_ctx)
 
     def __setstate__(self, state):
-        self._semlock = _multiprocessing.SemLock._rebuild(*state)
+        self._semlock = _multiprocessing.SemLock._rebuild(*state[:4])
+        self.is_fork_ctx = state[4]
         util.debug('recreated blocker with handle %r' % state[0])
         self._make_methods()
 
```