You should be able to safely combine asyncio and multiprocessing without too much trouble, though you shouldn't be using multiprocessing directly. The cardinal sin of asyncio (and any other event-loop based asynchronous framework) is blocking the event loop. If you try to use multiprocessing directly, any time you block waiting on a child process, you're going to block the event loop. Obviously, this is bad.

The simplest way to avoid this is to use BaseEventLoop.run_in_executor to execute a function in a concurrent.futures.ProcessPoolExecutor. ProcessPoolExecutor is a process pool implemented using multiprocessing.Process, but asyncio has built-in support for executing functions in it without blocking the event loop. Here's a simple example:
import time
import asyncio
from concurrent.futures import ProcessPoolExecutor

def blocking_func(x):
    time.sleep(x)  # Pretend this is expensive calculations
    return x * 5

@asyncio.coroutine
def main():
    #pool = multiprocessing.Pool()
    #out = pool.apply(blocking_func, args=(10,))  # This blocks the event loop.
    executor = ProcessPoolExecutor()
    out = yield from loop.run_in_executor(executor, blocking_func, 10)  # This does not
    print(out)

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
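On Python 3.5+ the same pattern reads more naturally with async/await syntax (and asyncio.run, available on 3.7+); the structure is identical, only the coroutine syntax changes. A minimal sketch, with the sleep shortened to 1 second:

```python
import time
import asyncio
from concurrent.futures import ProcessPoolExecutor

def blocking_func(x):
    time.sleep(x)  # Pretend this is expensive calculations
    return x * 5

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as executor:
        # run_in_executor returns an awaitable, so the event loop
        # stays free while the worker process does the sleeping.
        out = await loop.run_in_executor(executor, blocking_func, 1)
    print(out)
    return out

if __name__ == "__main__":
    asyncio.run(main())
```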
In the vast majority of cases, this function alone is good enough. If you do find yourself needing other constructs from multiprocessing, like Queue, Event, Manager, etc., there is a third-party library called aioprocessing (full disclosure: I wrote it), which provides asyncio-compatible versions of all the multiprocessing data structures. Here's an example demoing it:
import time
import asyncio
import aioprocessing
import multiprocessing

def func(queue, event, lock, items):
    with lock:
        event.set()
        for item in items:
            time.sleep(3)
            queue.put(item+5)
    queue.close()

@asyncio.coroutine
def example(queue, event, lock):
    l = [1,2,3,4,5]
    p = aioprocessing.AioProcess(target=func, args=(queue, event, lock, l))
    p.start()
    while True:
        result = yield from queue.coro_get()
        if result is None:
            break
        print("Got result {}".format(result))
    yield from p.coro_join()

@asyncio.coroutine
def example2(queue, event, lock):
    yield from event.coro_wait()
    with (yield from lock):
        yield from queue.coro_put(78)
        yield from queue.coro_put(None)  # Shut down the worker

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    queue = aioprocessing.AioQueue()
    lock = aioprocessing.AioLock()
    event = aioprocessing.AioEvent()
    tasks = [
        asyncio.async(example(queue, event, lock)),
        asyncio.async(example2(queue, event, lock)),
    ]
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()
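A nice consequence of the run_in_executor approach is that, since the event loop never blocks, you can dispatch several blocking calls at once and await them together with asyncio.gather. A minimal sketch, where blocking_func and its 0.2-second sleep are just stand-ins for real CPU-bound work:

```python
import time
import asyncio
from concurrent.futures import ProcessPoolExecutor

def blocking_func(x):
    time.sleep(0.2)  # Stand-in for an expensive calculation
    return x * 5

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as executor:
        # All three calls run in worker processes concurrently;
        # gather returns results in the order the inputs were submitted.
        futures = [loop.run_in_executor(executor, blocking_func, i)
                   for i in range(3)]
        results = await asyncio.gather(*futures)
    return results

if __name__ == "__main__":
    print(asyncio.run(main()))  # [0, 5, 10]
```

Total wall time here is roughly one 0.2-second sleep rather than three, since the calls overlap in separate processes.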