So far, whenever I have needed to use multiprocessing, I have done so by manually creating a "process pool" and sharing a single work queue with all the subprocesses. For example:
import logging
from multiprocessing import Process, Queue

class MyClass:
    def __init__(self, num_processes):
        self._log = logging.getLogger()
        self.process_list = []
        self.work_queue = Queue()
        for i in range(num_processes):
            p_name = 'CPU_%02d' % (i + 1)
            self._log.info('Initializing process %s', p_name)
            p = Process(target=do_stuff,
                        args=(self.work_queue, 'arg1'),
                        name=p_name)
            self.process_list.append(p)
            p.start()
This way I can add items to the queue, and they are consumed by the subprocesses. I can then monitor the progress of the processing by checking Queue.qsize():
while True:
    qsize = self.work_queue.qsize()
    if qsize == 0:
        self._log.info('Processing finished')
        break
    else:
        self._log.info('%d simulations still need to be calculated', qsize)
Now I think multiprocessing.Pool could simplify this code a lot. What I can't find out is how to monitor the amount of "work" still left to be done. Take this example:
from multiprocessing import Pool

class MyClass:
    def __init__(self, num_processes):
        self.process_pool = Pool(num_processes)
        # ...
        result_list = []
        for i in range(1000):
            result = self.process_pool.apply_async(do_stuff, ('arg1',))
            result_list.append(result)
        # ---> here: how do I monitor the Pool's processing progress?
        # ...?
Any ideas?