I am trying to crawl all tweets with the hashtag #nationaldoughnutday, but I keep hitting the rate limit.
With the code below, I wrapped the crawl in a while loop so that when the rate limit resets I can resume crawling from the date of the last crawled tweet (the `until` date).
However, I keep getting the error below repeatedly, and after sleeping for a long time my crawler does not seem to resume crawling:
TweepError Failed to send request: ('Connection aborted.', error (10054, 'An existing connection was forcibly closed by the remote host'))
Sleeping...
TweepError Failed to send request: ('Connection aborted.', error (10054, 'An existing connection was forcibly closed by the remote host'))
Sleeping...
TweepError Failed to send request: ('Connection aborted.', error (10054, 'An existing connection was forcibly closed by the remote host'))
I tried removing the inner try/except block, but that did not help either. Here is my code:
import datetime
import time

import tweepy

auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
api = tweepy.API(auth, wait_on_rate_limit=True, wait_on_rate_limit_notify=True)

query = '#nationaldoughnutday'
until_date = '2019-07-01'  # Twitter's `until` parameter expects YYYY-MM-DD

while True:
    try:  # outer try/except
        tweets = tweepy.Cursor(api.search, q=query + ' -filter:retweets',
                               count=100, lang='en', tweet_mode='extended',
                               until=until_date).items()
        for tweet in tweets:
            try:  # inner try/except
                print("tweet:", tweet.created_at)
                # so that if I reconnect with the Cursor, I resume from
                # the day before the last crawled tweet
                until_date = tweet.created_at.date() - datetime.timedelta(days=1)
            except tweepy.TweepError as e:
                print('Inner TweepError', e)
                time.sleep(17 * 60)
                break
    except tweepy.TweepError as e:
        print('TweepError', e)
        print('Sleeping...')
        time.sleep(17 * 60)
        continue
    except StopIteration:
        break
Thanks in advance!