My Scrapy project runs fine on my local machine. However, when deploying to Scrapinghub, I get an error:

$ shub deploy

Packing version 88e88d8-master
Deploying to Scrapy Cloud project "8888888"
Deploy log last 30 lines:
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/mycrawler/spiders/first_spider.py", line 4, in <module>
  File "/app/__main__.egg/mycrawler/items.py", line 4, in <module>
ImportError: No module named myCrawlerHelper
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}

{"status": "error", "message": "Internal error"}
Deploy log location: /var/folders/1w/x1jxnccs57d9h60kwwnsk83c0000gn/T/shub_deploy_irxzg9x8.log
Error: Deploy failed: b'{"status": "error", "message": "Internal error"

I put some helper functions into a file called myCrawlerHelper.py and import them in my spiders and in items.py. I believe the problem is related to this.
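A minimal, self-contained reproduction of what I think is happening (the layout and file contents here are hypothetical stand-ins for my project; only the names myCrawlerHelper and mycrawler come from the actual error): locally the project root is on sys.path, so a top-level `import myCrawlerHelper` works, but if the deployed egg bundles only the package and not the sibling helper file, the same import fails.

```python
import importlib
import os
import sys
import tempfile

# Recreate the assumed layout: a project root containing the helper
# module NEXT TO (not inside) the mycrawler package.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "mycrawler")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(root, "myCrawlerHelper.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "items.py"), "w") as f:
    f.write("import myCrawlerHelper\n")  # top-level import, like in my items.py

# Locally: the project root is on sys.path, so the import succeeds.
sys.path.insert(0, root)
items = importlib.import_module("mycrawler.items")
works_locally = items.myCrawlerHelper.VALUE == 42

# Deployed: simulate the helper not being included in the bundle.
os.remove(os.path.join(root, "myCrawlerHelper.py"))
for mod in ("mycrawler", "mycrawler.items", "myCrawlerHelper"):
    sys.modules.pop(mod, None)
importlib.invalidate_caches()
try:
    importlib.import_module("mycrawler.items")
    fails_when_deployed = False
except ImportError:  # the same failure shown in the deploy log
    fails_when_deployed = True

print(works_locally, fails_when_deployed)
```

If this is indeed the cause, it would explain why the traceback fails exactly at the `items.py` import line.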

I am also using Splash. I also noticed that the error message mentions Python 2.7, although I am using 3.6.

How can I fix this?
