
Is there a way to trigger a method in a Spider class just before it terminates?

I can terminate the spider myself, like so:

from scrapy.exceptions import CloseSpider  # needed for the raise below
from scrapy.spiders import CrawlSpider

class MySpider(CrawlSpider):
    # Config stuff goes here...

    def quit(self):
        #Do some stuff...
        raise CloseSpider('MySpider is quitting now.')

    def my_parser(self, response):
        if termination_condition:
            self.quit()

        #Parsing stuff goes here...

But I can't find any information on how to determine when a spider is about to quit naturally.


6 Answers


It looks like you can register a listener through the dispatcher.

I would try something like:

from scrapy import signals
from scrapy.xlib.pydispatch import dispatcher

class MySpider(CrawlSpider):
    def __init__(self, *args, **kwargs):
        super(MySpider, self).__init__(*args, **kwargs)
        dispatcher.connect(self.spider_closed, signals.spider_closed)

    def spider_closed(self, spider):
        # second param is the instance of the spider about to be closed
        pass

In newer versions of Scrapy, scrapy.xlib.pydispatch is deprecated. Instead, you can use from pydispatch import dispatcher.

Answered 2012-09-12T18:40:11.720

Just to update: you can simply define a closed function, like this:

class MySpider(CrawlSpider):
    def closed(self, reason):
        # do something here
        pass
Answered 2015-10-23T22:29:51.030

For Scrapy version 1.0.0+ (it may also work with older versions):

from scrapy import signals

class MySpider(CrawlSpider):
    name = 'myspider'

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        spider = super(MySpider, cls).from_crawler(crawler, *args, **kwargs)
        crawler.signals.connect(spider.spider_opened, signals.spider_opened)
        crawler.signals.connect(spider.spider_closed, signals.spider_closed)
        return spider

    def spider_opened(self, spider):
        print('Opening {} spider'.format(spider.name))

    def spider_closed(self, spider):
        print('Closing {} spider'.format(spider.name))

A nice use of this is adding a tqdm progress bar to a Scrapy spider.

# -*- coding: utf-8 -*-
from scrapy import signals
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule
from tqdm import tqdm


class MySpider(CrawlSpider):
    name = 'myspider'
    allowed_domains = ['somedomain.comm']
    start_urls = ['http://www.somedomain.comm/ccid.php']

    rules = (
        Rule(LinkExtractor(allow=r'^http://www.somedomain.comm/ccds.php\?id=.*'),
             callback='parse_item',
             ),
        Rule(LinkExtractor(allow=r'^http://www.somedomain.comm/ccid.php$',
                           restrict_xpaths='//table/tr[contains(., "SMTH")]'), follow=True),
    )

    def parse_item(self, response):
        self.pbar.update()  # update progress bar by 1
        item = MyItem()  # MyItem: the project's Item subclass (not shown here)
        # parse response
        return item

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        spider = super(MySpider, cls).from_crawler(crawler, *args, **kwargs)
        crawler.signals.connect(spider.spider_opened, signals.spider_opened)
        crawler.signals.connect(spider.spider_closed, signals.spider_closed)
        return spider

    def spider_opened(self, spider):
        self.pbar = tqdm()  # initialize progress bar
        self.pbar.clear()
        self.pbar.write('Opening {} spider'.format(spider.name))

    def spider_closed(self, spider):
        self.pbar.clear()
        self.pbar.write('Closing {} spider'.format(spider.name))
        self.pbar.close()  # close progress bar
Answered 2016-10-12T09:45:11.123

For recent versions (v1.7), just define a closed(reason) method in your spider class.

closed(reason)

Called when the spider closes. This method provides a shortcut to signals.connect() for the spider_closed signal.

Scrapy docs: scrapy.spiders.Spider.closed

Answered 2019-06-13T07:29:12.780

For me the accepted answer did not work, or at least is outdated for Scrapy 0.19. I got it working with the following:

from scrapy import signals
from scrapy.signalmanager import SignalManager
from scrapy.spiders import CrawlSpider
from scrapy.xlib.pydispatch import dispatcher

class MySpider(CrawlSpider):
    def __init__(self, *args, **kwargs):
        super(MySpider, self).__init__(*args, **kwargs)
        SignalManager(dispatcher.Any).connect(
            self.closed_handler, signal=signals.spider_closed)

    def closed_handler(self, spider):
        # do stuff here
        pass
Answered 2013-09-19T22:17:43.437

If you have many spiders and want to do something before each of them closes, it may be convenient to add a stats collector to your project.

In settings:

STATS_CLASS = 'scraper.stats.MyStatsCollector'

and the collector:

from scrapy.statscollectors import StatsCollector

class MyStatsCollector(StatsCollector):
    def _persist_stats(self, stats, spider):
        # do something here
        pass
Answered 2017-04-05T16:04:36.360