I have an application that iterates over batches of URLs from a Postgres table, downloads each URL, runs a processing function on every download, and saves the result of the processing back to the table.

I wrote it with aiopg and aiohttp so that it runs asynchronously. In simplified form it looks like this:

import asyncio
import aiopg
from aiohttp import ClientSession, TCPConnector

BATCH_SIZE = 100
# DB_USER, DB_PASSWORD and DB_HOST are defined in config elsewhere.
dsn = "dbname=events user={} password={} host={}".format(DB_USER, DB_PASSWORD, DB_HOST)

async def run():
    async with ClientSession(connector=TCPConnector(ssl=False, limit=100)) as session:
        async with aiopg.create_pool(dsn) as pool:
            while True:
                count = await run_batch(session, pool)
                if count == 0:
                    break

async def run_batch(session, db_pool):
    tasks = []
    async for url in get_batch(db_pool):
        task = asyncio.ensure_future(process_url(url, session, db_pool))
        tasks.append(task)
    await asyncio.gather(*tasks)
    # Return the batch size so the loop in run() can stop on an empty batch.
    return len(tasks)

async def get_batch(db_pool):
    sql = "SELECT id, url FROM db.urls ... LIMIT %s"
    async with db_pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute(sql, (BATCH_SIZE,))
            async for row in cur:  # aiopg cursors are async iterators
                yield row

async def process_url(url, session, db_pool):
    async with session.get(url, timeout=15) as response:
        body = await response.read()
        data = await process_body(body)  # process_body is a coroutine and must be awaited
        await save_data(db_pool, data)

async def process_body(body):
    ...  # parsing/transformation of the response body elided
    return data

async def save_data(db_pool, data):
    sql = "UPDATE db.urls ..."
    async with db_pool.acquire() as conn:
        async with conn.cursor() as cur:
            await cur.execute(sql, (data,))

But something is off: the longer the script runs, the slower the calls to session.get become. My guess is that I am using the Postgres connections the wrong way, but I can't figure out what's wrong. Any help would be much appreciated!
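To narrow this down, a minimal instrumented variant of process_url could time the HTTP fetch and the process-and-save step separately; the function name process_url_timed and the fetch/save split below are illustrative, not part of the actual app:

import time

async def process_url_timed(url, session, db_pool):
    # Instrumented variant of process_url: time the HTTP fetch and the
    # process-and-save step separately to see which one degrades over time.
    t0 = time.monotonic()
    async with session.get(url, timeout=15) as response:
        body = await response.read()
    t_fetch = time.monotonic() - t0

    t1 = time.monotonic()
    data = await process_body(body)
    await save_data(db_pool, data)
    t_save = time.monotonic() - t1

    print("{}: fetch {:.2f}s, process+save {:.2f}s".format(url, t_fetch, t_save))

If the fetch stays flat while process+save grows, the tasks are queueing on the database side; note that aiopg's create_pool defaults to maxsize=10, so each batch of up to 100 concurrent tasks contends for at most 10 connections.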
