I am trying to do a simple deploy of a Scrapy spider to Scrapinghub, following the rules they provide. For some reason the build specifically looks in a Python 3.6 directory, when it should be able to work against any Python 3.x install. My spider is written in Python 3.5, which is a problem. Scrapinghub says that specifying the "scrapy:1.4-py3" stack should cover the Python 3.x range, but that clearly isn't what is happening here.
Also, for some reason it doesn't seem to find my spider inside the project. Is that related to the 3.6 directory issue?
Finally, I have installed everything listed in my requirements file.
C:\Users\Desktop\Empery Code\YahooScrape>shub deploy
Packing version 1.0
Deploying to Scrapy Cloud project "205357"
Deploy log last 30 lines:
Deploy log location: C:\Users\AppData\Local\Temp\shub_deploy_of5_m4qg.log
Error: Deploy failed: b'{"status": "error", "message": "Internal build error"}'
    _run(args, settings)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python3.6/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python3.6/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python3.6/site-packages/scrapy/utils/misc.py", line 63, in walk_modules
    mod = import_module(path)
  File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 948, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'YahooScrape.spiders'
{"message": "list-spiders exit code: 1", "details": null, "error": "build_error"}
{"status": "error", "message": "Internal build error"}
C:\Users\Desktop\Empery Code\YahooScrape>
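For reference, the module the build fails to import, YahooScrape.spiders, is the package that Scrapy's SPIDER_MODULES setting points at. Assuming the default layout that scrapy startproject generates (which is what I used), the relevant lines of YahooScrape/settings.py look like the sketch below; the rest of the file is omitted.
# YahooScrape/settings.py -- relevant lines only (defaults from scrapy startproject)
BOT_NAME = 'YahooScrape'

SPIDER_MODULES = ['YahooScrape.spiders']   # the package the deploy log fails to import
NEWSPIDER_MODULE = 'YahooScrape.spiders'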
Scrapy.cfg file:
# Automatically created by: scrapy startproject
#
# For more information about the [deploy] section see:
# https://scrapyd.readthedocs.org/en/latest/deploy.html
[settings]
default = YahooScrape.settings
[deploy]
#url = http://localhost:6800/
project = YahooScrape
Scrapinghub.yml contents:
project: -----
requirements:
  file: requirements.txt
stacks:
  default: scrapy:1.4-py3
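The spider itself lives under YahooScrape/spiders/ next to an __init__.py. A minimal sketch of its shape is below; the class name, spider name, URL, and parse logic are placeholders, not my actual code.
# YahooScrape/spiders/yahoo.py -- placeholder sketch, not the real spider
import scrapy

class YahooSpider(scrapy.Spider):
    name = 'yahoo'  # placeholder spider name
    start_urls = ['https://finance.yahoo.com/']  # placeholder URL

    def parse(self, response):
        # placeholder parse: just yield the page title
        yield {'title': response.css('title::text').get()}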