
I just started using the asyncio library in Python 3.4 and wrote a small program that tries to fetch 50 web pages at a time. After a few hundred requests the program crashes with a "Too many open files" exception.

I thought my fetch method closed the connection via the 'response.read_and_close()' method call.

Any ideas? Am I going about this the right way?

import asyncio
import aiohttp

@asyncio.coroutine
def fetch(url):
    response = yield from aiohttp.request('GET', url)
    response = yield from response.read_and_close()
    return response.decode('utf-8')

@asyncio.coroutine
def print_page(url):
    page = yield from fetch(url)
    # print(page)

@asyncio.coroutine
def process_batch_of_urls(round, urls):
    print("Round starting: %d" % round)
    coros = []
    for url in urls:
        coros.append(asyncio.Task(print_page(url)))
    yield from asyncio.gather(*coros)
    print("Round finished: %d" % round)

@asyncio.coroutine
def process_all():
    api_url = 'https://google.com'
    for i in range(10):
        urls = []
        for url in range(50):
            urls.append(api_url)
        yield from process_batch_of_urls(i, urls)


loop = asyncio.get_event_loop()
loop.run_until_complete(process_all())

The error I'm getting is:

Traceback (most recent call last):
  File "/usr/local/lib/python3.4/site-packages/aiohttp/client.py", line 106, in request
  File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 135, in connect
  File "/usr/local/lib/python3.4/site-packages/aiohttp/connector.py", line 242, in _create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 424, in create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/asyncio/base_events.py", line 392, in create_connection
  File "/usr/local/Cellar/python3/3.4.1/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 123, in __init__
OSError: [Errno 24] Too many open files

During handling of the above exception, another exception occurred:

2 Answers


Aha, I see your problem.

An explicit connector definitely can solve the issue.

https://github.com/KeepSafe/aiohttp/pull/79 should fix it for implicit connectors too.

Thank you very much for finding the resource leak in aiohttp.

Update: aiohttp 0.8.2 doesn't have this problem.

answered 2014-06-22T14:36:51.897

OK, I finally got it working.

It turns out I had to use a TCPConnector to pool connections.

So I made this variable:

connector = aiohttp.TCPConnector(share_cookies=True, loop=loop)

and passed it into each fetch request. My new fetch routine looks like this:

@asyncio.coroutine
def fetch(url):
    data = ""
    try:
        yield from asyncio.sleep(1)
        response = yield from aiohttp.request('GET', url, connector=connector)
    except Exception as exc:
        print('...', url, 'has error', repr(str(exc)))
    else:
        data = (yield from response.read()).decode('utf-8', 'replace')
        response.close()

    return data
answered 2014-06-20T03:52:24.953