When I try to run scrapy from the Command Prompt, I get the following error:
C:\Users\XXXXX>scrapy
Traceback (most recent call last):
  File "c:\users\XXXXX\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\XXXXX\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\XXXXX\Anaconda3\Scripts\scrapy.exe\__main__.py", line 5, in <module>
  File "c:\users\XXXXX\anaconda3\lib\site-packages\scrapy\__init__.py", line 34, in <module>
    from scrapy.spiders import Spider
  File "c:\users\XXXXX\anaconda3\lib\site-packages\scrapy\spiders\__init__.py", line 10, in <module>
    from scrapy.http import Request
  File "c:\users\XXXXX\anaconda3\lib\site-packages\scrapy\http\__init__.py", line 11, in <module>
    from scrapy.http.request.form import FormRequest
  File "c:\users\XXXXX\anaconda3\lib\site-packages\scrapy\http\request\form.py", line 16, in <module>
    from scrapy.utils.response import get_base_url
  File "c:\users\XXXXX\anaconda3\lib\site-packages\scrapy\utils\response.py", line 10, in <module>
    from twisted.web import http
  File "c:\users\XXXXX\anaconda3\lib\site-packages\twisted\web\http.py", line 102, in <module>
    from twisted.internet import interfaces, protocol, address
  File "c:\users\XXXXX\anaconda3\lib\site-packages\twisted\internet\address.py", line 101, in <module>
    @attr.s(hash=False, repr=False, eq=False)
TypeError: attrs() got an unexpected keyword argument 'eq'
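The TypeError at the bottom is ordinary Python keyword-argument behavior: Twisted's address.py passes an eq keyword that older attrs releases do not accept. A minimal stdlib-only sketch of that failure mode (old_attrs_decorator is a hypothetical stand-in for the pre-19.2.0 attr.s signature, not the real one):

```python
# Hypothetical, simplified stand-in for the attr.s decorator signature
# before attrs 19.2.0 added the `eq` keyword.
def old_attrs_decorator(hash=None, repr=None):
    pass

try:
    # Twisted 20.3.0 calls attr.s(hash=False, repr=False, eq=False),
    # which an older signature rejects with the same kind of TypeError.
    old_attrs_decorator(hash=False, repr=False, eq=False)
except TypeError as exc:
    print(exc)  # ... got an unexpected keyword argument 'eq'
```
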
I tried checking the versions and updating them.
Abridged output of pip list:
C:\Users\XXXXX>pip list
attr 0.3.1
attrs 19.1.0
Scrapy 2.1.0
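Twisted 20.3.0 (pulled in by Scrapy) declares attrs>=19.2.0, the release that introduced the eq keyword, so the installed 19.1.0 is too old. A small comparison illustrates why (version_tuple and satisfies_minimum are illustrative helpers written for this post, not part of pip):

```python
# Illustrative helpers (not part of pip): compare dotted version strings
# naively, by converting them to integer tuples.
def version_tuple(version):
    return tuple(int(part) for part in version.split("."))

def satisfies_minimum(installed, minimum):
    return version_tuple(installed) >= version_tuple(minimum)

# Twisted 20.3.0 requires attrs >= 19.2.0; pip list showed attrs 19.1.0.
print(satisfies_minimum("19.1.0", "19.2.0"))  # False -> upgrade needed
print(satisfies_minimum("19.3.0", "19.2.0"))  # True
```
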
C:\Users\XXXXX>pip install --upgrade attrs
Collecting attrs
Downloading https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Installing collected packages: attrs
Found existing installation: attrs 19.1.0
Uninstalling attrs-19.1.0:
Successfully uninstalled attrs-19.1.0
Successfully installed attrs-19.3.0
C:\Users\XXXXX>pip install --upgrade scrapy
Requirement already up-to-date: scrapy in c:\users\XXXXX\anaconda3\lib\site-packages (2.1.0)
Requirement already satisfied, skipping upgrade: w3lib>=1.17.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (1.21.0)
Requirement already satisfied, skipping upgrade: protego>=0.1.15 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (0.1.16)
Requirement already satisfied, skipping upgrade: queuelib>=1.4.2 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (1.5.0)
Requirement already satisfied, skipping upgrade: Twisted>=17.9.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (20.3.0)
Requirement already satisfied, skipping upgrade: parsel>=1.5.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (1.5.2)
Requirement already satisfied, skipping upgrade: cryptography>=2.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (2.6.1)
Requirement already satisfied, skipping upgrade: pyOpenSSL>=16.2.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (19.0.0)
Requirement already satisfied, skipping upgrade: lxml>=3.5.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (4.3.2)
Requirement already satisfied, skipping upgrade: service-identity>=16.0.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (18.1.0)
Requirement already satisfied, skipping upgrade: zope.interface>=4.1.3 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (5.1.0)
Requirement already satisfied, skipping upgrade: PyDispatcher>=2.0.5 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (2.0.5)
Requirement already satisfied, skipping upgrade: cssselect>=0.9.1 in c:\users\XXXXX\anaconda3\lib\site-packages (from scrapy) (1.1.0)
Requirement already satisfied, skipping upgrade: six>=1.4.1 in c:\users\XXXXX\anaconda3\lib\site-packages (from w3lib>=1.17.0->scrapy) (1.12.0)
Requirement already satisfied, skipping upgrade: PyHamcrest!=1.10.0,>=1.9.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (2.0.2)
Requirement already satisfied, skipping upgrade: hyperlink>=17.1.1 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (19.0.0)
Requirement already satisfied, skipping upgrade: constantly>=15.1 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (15.1.0)
Requirement already satisfied, skipping upgrade: attrs>=19.2.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (19.3.0)
Requirement already satisfied, skipping upgrade: Automat>=0.3.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (20.2.0)
Requirement already satisfied, skipping upgrade: incremental>=16.10.1 in c:\users\XXXXX\anaconda3\lib\site-packages (from Twisted>=17.9.0->scrapy) (17.5.0)
Requirement already satisfied, skipping upgrade: asn1crypto>=0.21.0 in c:\users\XXXXX\anaconda3\lib\site-packages (from cryptography>=2.0->scrapy) (0.24.0)
Requirement already satisfied, skipping upgrade: cffi!=1.11.3,>=1.8 in c:\users\XXXXX\anaconda3\lib\site-packages (from cryptography>=2.0->scrapy) (1.12.2)
Requirement already satisfied, skipping upgrade: pyasn1-modules in c:\users\XXXXX\anaconda3\lib\site-packages (from service-identity>=16.0.0->scrapy) (0.2.7)
Requirement already satisfied, skipping upgrade: pyasn1 in c:\users\XXXXX\anaconda3\lib\site-packages (from service-identity>=16.0.0->scrapy) (0.4.8)
Requirement already satisfied, skipping upgrade: setuptools in c:\users\XXXXX\anaconda3\lib\site-packages (from zope.interface>=4.1.3->scrapy) (40.8.0)
Requirement already satisfied, skipping upgrade: idna>=2.5 in c:\users\XXXXX\anaconda3\lib\site-packages (from hyperlink>=17.1.1->Twisted>=17.9.0->scrapy) (2.8)
Requirement already satisfied, skipping upgrade: pycparser in c:\users\XXXXX\anaconda3\lib\site-packages (from cffi!=1.11.3,>=1.8->cryptography>=2.0->scrapy) (2.19)
C:\Users\XXXXX>python --version
Python 3.7.3
C:\Users\XXXXX>scrapy
Scrapy 2.1.0 - no active project
Usage:
  scrapy <command> [options] [args]

Available commands:
  bench         Run quick benchmark test
  fetch         Fetch a URL using the Scrapy downloader
  genspider     Generate new spider using pre-defined templates
  runspider     Run a self-contained spider (without creating a project)
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

  [ more ]      More commands available when run from project directory

Use "scrapy <command> -h" to see more info about a command
C:\Users\XXXXX>
As you can see, the attrs version was conflicting with Scrapy: attrs 19.1.0 predates the eq keyword that Twisted relies on, and upgrading attrs to 19.3.0 made the scrapy command work again.