I'm new to Python and Scrapy. After setting restrict_xpaths to '//table[@class="lista"]', I get the traceback below. Strangely, the crawler works correctly with other XPath rules.

Traceback (most recent call last):
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/twisted/internet/base.py", line 800, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/twisted/internet/task.py", line 602, in _tick
    taskObj._oneWorkUnit()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/twisted/internet/task.py", line 479, in _oneWorkUnit
    result = self._iterator.next()
  File "/Library/Python/2.7/site-packages/scrapy/utils/defer.py", line 57, in <genexpr>
    work = (callable(elem, *args, **named) for elem in iterable)
--- <exception caught here> ---
  File "/Library/Python/2.7/site-packages/scrapy/utils/defer.py", line 96, in iter_errback
    yield it.next()
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spidermiddleware/offsite.py", line 23, in process_spider_output
    for x in result:
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spidermiddleware/referer.py", line 22, in <genexpr>
    return (_set_referer(r) for r in result or ())
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spidermiddleware/urllength.py", line 33, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spidermiddleware/depth.py", line 50, in <genexpr>
    return (r for r in result or () if _filter(r))
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spiders/crawl.py", line 73, in _parse_response
    for request_or_item in self._requests_to_follow(response):
  File "/Library/Python/2.7/site-packages/scrapy/contrib/spiders/crawl.py", line 52, in _requests_to_follow
    links = [l for l in rule.link_extractor.extract_links(response) if l not in seen]
  File "/Library/Python/2.7/site-packages/scrapy/contrib/linkextractors/sgml.py", line 124, in extract_links
    ).encode(response.encoding)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/encodings/iso8859_2.py", line 12, in encode
    return codecs.charmap_encode(input,errors,encoding_table)
exceptions.UnicodeEncodeError: 'charmap' codec can't encode character u'\xbb' in position 686: character maps to <undefined>

Here is the MySpider class:

from scrapy.contrib.spiders import CrawlSpider, Rule
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor
from scrapy.selector import HtmlXPathSelector
from ds_crawl.items import DsCrawlItem

class MySpider(CrawlSpider):
    name = 'inside'
    allowed_domains = ['wroclaw.dlastudenta.pl']
    start_urls = ['http://wroclaw.dlastudenta.pl/stancje/']

    rules = (
        Rule(SgmlLinkExtractor(allow=('show_stancja',), restrict_xpaths=('//table[@class="lista"]',)),
             callback='parse_item', follow=True),
    )

    def parse_item(self, response):
        hxs = HtmlXPathSelector(response)
        titles = hxs.select("//p[@class='bbtext intextAd']")
        for title in titles:
            item = DsCrawlItem()
            item['content'] = title.select("text()").extract()
            print item

Any explanation of this error and any help would be appreciated. Thank you.

1 Answer

The error is caused by the web page using the &raquo; entity, which lxml converts to the unicode character \xbb. When you use the restrict_xpaths argument, the link extractor encodes the extracted content back to the original response encoding, iso8859-2, and that encoding step fails because the character \xbb has no mapping in iso8859-2.

This single line reproduces the exception:

>>> u'\xbb'.encode('iso8859-2')
...
UnicodeEncodeError: 'charmap' codec can't encode character u'\xbb' in position 0: character maps to <undefined>
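
For illustration, here is a minimal sketch (assuming lxml, which Scrapy uses for parsing, is available) of how the &raquo; entity ends up as that unicode character once the page is parsed:

>>> import lxml.html
>>> lxml.html.fromstring('<p>more &raquo;</p>').text_content()
u'more \xbb'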

One workaround is to force utf-8 encoding for all responses. This can be done with a simple downloader middleware:

# file: myproject/middlewares.py

class ForceUTF8Response(object):
    """A downloader middleware to force UTF-8 encoding for all responses."""
    encoding = 'utf-8'

    def process_response(self, request, response, spider):
        # Note: in Scrapy <1.0, use response.body_as_unicode() instead of response.text.
        new_body = response.text.encode(self.encoding)
        return response.replace(body=new_body, encoding=self.encoding)

And in your settings:

DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.ForceUTF8Response': 100,
}
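
To confirm the middleware is taking effect, an illustrative sanity check (this log line is hypothetical, not part of the fix) is to log the encoding inside your spider callback:

def parse_item(self, response):
    # With the middleware enabled, every response should now report utf-8.
    self.log('response encoding: %s' % response.encoding)
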
answered 2013-10-26T02:56:38.340