I'm trying to query about 50 Wikipedia pages. I've been using the requests package to make GET requests, but I've been trying to switch to grequests because I've heard it performs much better.
For me the performance gain is really small. Am I doing something wrong?
# Import grequests before requests so gevent's monkey-patching runs first
import grequests
import requests
from urllib.parse import quote
from time import time
url = 'https://en.wikipedia.org/w/api.php?action=query&titles={0}&prop=pageprops&ppprop=disambiguation&format=json'
titles = ['Harriet Tubman', 'Car', 'Underground Railroad', 'American Civil War', 'Kate Larson']
urls = [url.format(quote(title)) for title in titles]
def sync_test(urls):
    """Fetch each URL sequentially with requests; return the elapsed time."""
    results = []
    s = time()
    for url in urls:
        results.append(requests.get(url))
    e = time()
    return e - s
def async_test(urls):
    """Fetch all URLs concurrently with grequests; return the elapsed time."""
    s = time()
    results = grequests.map((grequests.get(url) for url in urls))
    e = time()
    return e - s
def iterate(urls, num):
    """Run both tests num times and print the accumulated timings."""
    sync_time = 0
    async_time = 0
    for i in range(num):
        sync_time += sync_test(urls)
        async_time += async_test(urls)
    print("sync_time: {}\nasync_time: {}".format(sync_time, async_time))
Output:
sync_time: 8.945282936096191
async_time: 7.97578239440918
Thanks!