
I'm new to asynchronous programming. I'm trying to write a script that checks the status of web pages, and naturally I want to do it asynchronously. My snippet:

import aiohttp
import asyncio

url_site = 'http://anysite.com'
fuzz_file = 'fuzz.txt'


def generate_links(file):
    with open(file) as f:
        return [str(url_site) + str(line.strip()) for line in f]

async def fetch_page(client, url):
    async with client.get(url) as response:
        return response.status

async def run():
    links = generate_links(fuzz_file)
    for f,link in asyncio.as_completed([fetch_page(client,link) for link in links]):
        print("[INFO] [{}] {}".format(f, link))


loop = asyncio.get_event_loop()
conn = aiohttp.ProxyConnector(proxy="http://10.7.0.35:8080")
client = aiohttp.ClientSession(loop=loop, connector=conn)
loop.run_until_complete(run())
client.close()

But I get the following error: Task was destroyed but it is pending! Can someone point out where I went wrong?


1 Answer


From the docs for as_completed:

Return an iterator whose values, when waited on, are Future instances.

So you have to await each object returned by as_completed:

for f in asyncio.as_completed([fetch_page(client,link) for link in links]):
    status = await f
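Putting that together, here is a minimal runnable sketch of the corrected loop. Since as_completed yields results in completion order, not submission order, the snippet wraps the fetch in a hypothetical fetch_with_link helper so each status stays paired with its link; fetch_page here is a stand-in that returns a fake status instead of making a real aiohttp request, so the pattern runs without a network.

```python
import asyncio

# Stand-in for the real aiohttp fetch_page: no network, just a fake status.
async def fetch_page(client, url):
    await asyncio.sleep(0)
    return 200

async def fetch_with_link(client, link):
    # Wrap the fetch so each completed future carries its link along.
    status = await fetch_page(client, link)
    return link, status

async def run(links):
    results = []
    for f in asyncio.as_completed(
            [fetch_with_link(None, link) for link in links]):
        # as_completed yields futures; await each one to get its value.
        link, status = await f
        results.append((link, status))
        print("[INFO] [{}] {}".format(status, link))
    return results

loop = asyncio.new_event_loop()
results = loop.run_until_complete(
    run(['http://anysite.com/a', 'http://anysite.com/b']))
loop.close()
```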

You might also want to look into asyncio.wait for finer-grained control.
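For illustration, a small sketch of asyncio.wait: unlike as_completed, it hands back the done and pending task sets at once and takes a timeout, which is useful when some pages hang. The check coroutine below is a hypothetical stand-in that returns a fake status rather than a real HTTP request.

```python
import asyncio

# Hypothetical stand-in for a real status check.
async def check(url):
    await asyncio.sleep(0)
    return url, 200

async def main():
    # Schedule the checks as tasks, then wait on all of them at once.
    tasks = [asyncio.ensure_future(check(u))
             for u in ('http://anysite.com/x', 'http://anysite.com/y')]
    # wait() returns the (done, pending) sets; timeout bounds how long we block.
    done, pending = await asyncio.wait(tasks, timeout=5)
    return {t.result() for t in done}

loop = asyncio.new_event_loop()
statuses = loop.run_until_complete(main())
loop.close()
```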

Answered 2015-11-26T20:46:21.943