I downloaded scrapy-redis from GitHub and ran it following its instructions, but it fails with this error:
2013-01-04 17:38:50+0800 [-] ERROR: Unhandled error in Deferred:
2013-01-04 17:38:50+0800 [-] Unhandled Error
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/Scrapy-0.16.3-py2.7.egg/scrapy/cmdline.py", line 138, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/Scrapy-0.16.3-py2.7.egg/scrapy/commands/crawl.py", line 44, in run
    self.crawler.crawl(spider)
  File "/usr/local/lib/python2.7/dist-packages/Scrapy-0.16.3-py2.7.egg/scrapy/crawler.py", line 47, in crawl
    return self.engine.open_spider(spider, requests)
  File "/usr/local/lib/python2.7/dist-packages/Twisted-12.2.0-py2.7-linux-i686.egg/twisted/internet/defer.py", line 1187, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/local/lib/python2.7/dist-packages/Twisted-12.2.0-py2.7-linux-i686.egg/twisted/internet/defer.py", line 1045, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/Scrapy-0.16.3-py2.7.egg/scrapy/core/engine.py", line 218, in open_spider
    scheduler = self.scheduler_cls.from_crawler(self.crawler)
exceptions.AttributeError: type object 'Scheduler' has no attribute 'from_crawler'
How can I fix this? Thanks.
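For context, the traceback shows that Scrapy 0.16's engine constructs the scheduler by calling a `from_crawler` classmethod on the configured scheduler class (`scheduler = self.scheduler_cls.from_crawler(self.crawler)`), so the error means the installed scrapy-redis `Scheduler` predates that interface. A minimal sketch of the interface the engine expects, using a hypothetical `Scheduler` and a fake crawler object (neither is the actual scrapy-redis code):

```python
# Sketch of the scheduler interface Scrapy 0.16's engine expects.
# The engine resolves a scheduler class from settings and calls its
# `from_crawler` classmethod; a class without that attribute triggers
# exactly the AttributeError shown in the traceback above.

class Scheduler(object):  # hypothetical, not the scrapy-redis class
    def __init__(self, settings):
        self.settings = settings

    @classmethod
    def from_crawler(cls, crawler):
        # The engine hands the scheduler the crawler; the scheduler
        # typically pulls what it needs (here, the settings) from it.
        return cls(crawler.settings)


class FakeCrawler(object):  # stand-in for scrapy.crawler.Crawler
    settings = {"REDIS_URL": "redis://localhost:6379"}  # hypothetical key


scheduler = Scheduler.from_crawler(FakeCrawler())
print(scheduler.settings["REDIS_URL"])
```

If the scrapy-redis checkout you downloaded only defines an older-style constructor and no `from_crawler` classmethod, it was written against an earlier Scrapy release than the 0.16.3 shown in the paths above, which would explain the mismatch.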