
In Python, I've seen many examples where multiprocessing is invoked but the target just prints something. I have a scenario where the target returns two variables that I need to use later. For example:

from multiprocessing import Process

def foo(some_args):
    a = someObject   # first value computed in the child
    b = someObject   # second value
    return a, b

p1 = Process(target=foo, args=(some_args,))
p2 = Process(target=foo, args=(some_args,))
p3 = Process(target=foo, args=(some_args,))

What now? I can call .start() and .join(), but how do I retrieve the individual results? I need to capture the returned a, b for every job I run and then do something with them.


6 Answers


You're looking to do some embarrassingly parallel work using multiple processes, so why not use a Pool? A Pool will take care of starting the processes, retrieving the results, and handing the results back to you.
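For reference, here is a minimal sketch of the same idea with the standard library's multiprocessing.Pool; the work function and its inputs are made up for illustration:

from multiprocessing import Pool

def foo(x):
    # placeholder work: compute and return two values per input
    a = x * 2
    b = x ** 2
    return a, b

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        results = pool.map(foo, [1, 2, 3])   # one (a, b) tuple per input
    print(results)  # [(2, 1), (4, 4), (6, 9)]

Note that the standard library's Pool.map takes a single iterable, while the pathos version used below accepts one iterable per argument of the function, like the builtin map.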

I use pathos, which has a fork of multiprocessing, because it offers much better serialization than the standard library version does.

In a .py file:

from pathos.multiprocessing import ProcessingPool as Pool

def foo(obj1, obj2):
    a = obj1.x**2
    b = obj2.x**2
    return a, b

class Bar(object):
    def __init__(self, x):
        self.x = x

# pathos's map takes one iterable per argument of foo
results = Pool().map(foo, [Bar(1), Bar(2), Bar(3)], [Bar(4), Bar(5), Bar(6)])
print(results)

Result:

[(1, 16), (4, 25), (9, 36)]

You can see that foo takes two arguments and returns a tuple of two objects. The map method of Pool submits foo to the underlying processes and hands the results back as a list, one (a, b) tuple per call.

You can get pathos here: https://github.com/uqfoundation

Answered 2015-03-14T15:22:32.037

Yes, of course: there are several ways to do it. One of the simplest is a shared Queue. See an example here: http://eli.thegreenplace.net/2012/01/16/python-parallelizing-cpu-bound-tasks-with-multiprocessing/
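A minimal sketch of that approach, with a made-up worker that sends its two values back through a shared multiprocessing.Queue:

from multiprocessing import Process, Queue

def foo(x, out_queue):
    a = x * 2              # placeholder computations
    b = x ** 2
    out_queue.put((a, b))  # send both values back to the parent

if __name__ == '__main__':
    queue = Queue()
    procs = [Process(target=foo, args=(i, queue)) for i in range(3)]
    for p in procs:
        p.start()
    # drain the queue before joining; the order of results is not guaranteed
    results = [queue.get() for _ in procs]
    for p in procs:
        p.join()
    print(results)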

Answered 2012-05-29T11:15:28.333

I copied this example straight from the docs because I can't give you a direct link to it. Note that it prints the results from done_queue, but you can do whatever you like with them.

#
# Simple example which uses a pool of workers to carry out some tasks.
#
# Notice that the results will probably not come out of the output
# queue in the same order as the corresponding tasks were
# put on the input queue.  If it is important to get the results back
# in the original order then consider using `Pool.map()` or
# `Pool.imap()` (which will save on the amount of code needed anyway).
#
# Copyright (c) 2006-2008, R Oudkerk
# All rights reserved.
#

import time
import random

from multiprocessing import Process, Queue, current_process, freeze_support

#
# Function run by worker processes
#

def worker(input, output):
    for func, args in iter(input.get, 'STOP'):
        result = calculate(func, args)
        output.put(result)

#
# Function used to calculate result
#

def calculate(func, args):
    result = func(*args)
    return '%s says that %s%s = %s' % \
        (current_process().name, func.__name__, args, result)

#
# Functions referenced by tasks
#

def mul(a, b):
    time.sleep(0.5*random.random())
    return a * b

def plus(a, b):
    time.sleep(0.5*random.random())
    return a + b

#
#
#

def test():
    NUMBER_OF_PROCESSES = 4
    TASKS1 = [(mul, (i, 7)) for i in range(20)]
    TASKS2 = [(plus, (i, 8)) for i in range(10)]

    # Create queues
    task_queue = Queue()
    done_queue = Queue()

    # Submit tasks
    for task in TASKS1:
        task_queue.put(task)

    # Start worker processes
    for i in range(NUMBER_OF_PROCESSES):
        Process(target=worker, args=(task_queue, done_queue)).start()

    # Get and print results
    print('Unordered results:')
    for i in range(len(TASKS1)):
        print('\t', done_queue.get())

    # Add more tasks using `put()`
    for task in TASKS2:
        task_queue.put(task)

    # Get and print some more results
    for i in range(len(TASKS2)):
        print('\t', done_queue.get())

    # Tell child processes to stop
    for i in range(NUMBER_OF_PROCESSES):
        task_queue.put('STOP')


if __name__ == '__main__':
    freeze_support()
    test()

It originally comes from the multiprocessing module documentation.

Answered 2012-05-29T11:19:41.863

Why does nobody use the callback of multiprocessing.Pool?

Example:

import time

from multiprocessing import Pool
from contextlib import contextmanager

from pprint import pprint
from requests import get as get_page

WORKERS = 4                                          # pool size
URLS = ["http://example.com", "http://example.org"]  # pages to fetch

@contextmanager
def _terminating(thing):
    try:
        yield thing
    finally:
        thing.terminate()

def _callback(*args, **kwargs):
    print("CALBACK")
    pprint(args)
    pprint(kwargs)

print("Processing...")
with _terminating(Pool(processes=WORKERS)) as pool:
    results = pool.map_async(get_page, URLS, callback=_callback)

    start_time = time.time()
    results.wait()
    end_time = time.time()
    print("Time for Processing: %ssecs" % (end_time - start_time))

Here I just print the args and kwargs. You could instead replace the callback with something like:

def _callback2(responses):
    for r in responses:
        print(r.status_code) # or do whatever with response...

Answered 2017-05-21T13:31:10.810

It doesn't work on Windows, but here is my multiprocessing decorator for functions. It returns a queue that you can poll and collect the returned data from.

import os
from multiprocessing import Process, Queue

def returning_wrapper(func, *args, **kwargs):
    queue = kwargs.get("multiprocess_returnable")
    del kwargs["multiprocess_returnable"]
    queue.put(func(*args, **kwargs))

class Multiprocess(object):
    """Cute decorator to run a function in multiple processes."""
    def __init__(self, func):
        self.func = func
        self.processes = []

    def __call__(self, *args, **kwargs):
        num_processes = kwargs.get("multiprocess_num_processes", 2) # default to two processes.
        return_obj = kwargs.get("multiprocess_returnable", Queue()) # default to a multiprocessing Queue
        kwargs["multiprocess_returnable"] = return_obj
        for i in range(num_processes):
            pro = Process(target=returning_wrapper, args=tuple([self.func] + list(args)), kwargs=kwargs)
            self.processes.append(pro)
            pro.start()
        return return_obj


@Multiprocess
def info():
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())
    return 4 * 22

data = info()
print(data.get())  # block until the first child posts its result

Answered 2012-05-29T11:40:37.343

Here is an example of searching a large file with multiple processes.
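The link to the example has been lost; as a rough sketch of the idea, assuming a hypothetical file big.log and search term, one can split the file into batches of lines and let a Pool scan them in parallel:

from multiprocessing import Pool

NEEDLE = 'error'   # hypothetical search term
CHUNK = 100000     # lines per task

def search(lines):
    # return every line in this chunk that contains the needle
    return [line for line in lines if NEEDLE in line]

if __name__ == '__main__':
    # read the file in the parent and batch it into chunks of lines
    chunks, batch = [], []
    with open('big.log') as f:   # hypothetical file
        for line in f:
            batch.append(line)
            if len(batch) == CHUNK:
                chunks.append(batch)
                batch = []
    if batch:
        chunks.append(batch)
    with Pool() as pool:
        # each worker searches one chunk; map preserves chunk order
        for hits in pool.map(search, chunks):
            print(''.join(hits), end='')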

Answered 2013-12-31T13:33:38.840