
I am trying to scrape the first 50 pages from this URL:

https://www.realtor.ca/Residential/Map.aspx#CultureId=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&LatitudeMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=10&PropertyTypeGroupID=1

I have Splash set up, and my spider file is as follows:

list.py

import scrapy
from scrapy_splash import SplashRequest

from urls import start_urls


class BbbSpider(scrapy.Spider):
    name = 'list'
    custom_settings = {'AUTOTHROTTLE_ENABLED': True}

    def start_requests(self):
        for x in start_urls:
            # request pages 1 through 50
            for i in range(1, 51):
                req = x + '&CurrentPage=' + str(i)
                yield SplashRequest(req, self.parse, endpoint='render.html', args={'wait': 0.5})

    def parse(self, response):
        # For now I just dump the rendered body; the address selector below is not used yet.
        NAME_SELECTOR = '/a/div[@class="m_gallery_lst_cell_sec"]/div[@id="hm_lst_address1"]/text()[1]'
        yield {
            'data': response.body,
        }
        

Another file, urls.py, basically holds the start URL:

start_urls = ['https://www.realtor.ca/Residential/Map.aspx#CultureId=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&LatitudeMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=10&PropertyTypeGroupID=1', ]

When I run the crawl, this is the log I get from Scrapy:

2017-06-24 17:06:22 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://www.realtor.ca/Residential/Map.aspx#CultureId
=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange
=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&Latitud
eMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=1
0&PropertyTypeGroupID=1&CurrentPage=15 via http://192.168.99.100:8050/render.html> (referer: None)
2017-06-24 17:06:22 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.realtor.ca/Residential/Map.aspx#CultureId
=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange
=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&Latitud
eMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=1
0&PropertyTypeGroupID=1&CurrentPage=12>
{'data': '<!DOCTYPE html><html><head>\n<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">\n<meta http-equiv="cache-control
" content="max-age=0">\n<meta http-equiv="cache-control" content="no-cache">\n<meta http-equiv="expires" content="0">\n<
meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT">\n<meta http-equiv="pragma" content="no-cache">\n<meta
http-equiv="refresh" content="10; url=/distil_r_captcha.html?Ref=/Residential/Map.aspx&amp;distil_RID=5132BCAC-58D1-11E7
-A2E9-911702A49067&amp;distil_TID=20170624113609">\n<script type="text/javascript">\n\t(function(window){\n\t\ttry {\n\t
\t\tif (typeof sessionStorage !== \'undefined\'){\n\t\t\t\tsessionStorage.setItem(\'distil_referrer\', document.referrer
);\n\t\t\t}\n\t\t} catch (e){}\n\t})(window);\n</script>\n<script type="text/javascript" src="/cndnrlsttdstlssxbaquqxaws
wrdyrzbte.js" defer=""></script><style type="text/css">#d__fFH{position:absolute;top:-5000px;left:-5000px}#d__fF{font-fa
mily:serif;font-size:200px;visibility:hidden}#rvwebwba{display:none!important}</style></head>\n<body>\n<div id="distil_i
dent_block">&nbsp;</div>\n\n\n</body></html>'}
2017-06-24 17:06:22 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.realtor.ca/Residential/Map.aspx#CultureId
=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange
=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&Latitud
eMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=1
0&PropertyTypeGroupID=1&CurrentPage=8>
{'data': '<!DOCTYPE html><html><head>\n<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">\n<meta http-equiv="cache-control
" content="max-age=0">\n<meta http-equiv="cache-control" content="no-cache">\n<meta http-equiv="expires" content="0">\n<
meta http-equiv="expires" content="Tue, 01 Jan 1980 1:00:00 GMT">\n<meta http-equiv="pragma" content="no-cache">\n<meta
http-equiv="refresh" content="10; url=/distil_r_captcha.html?Ref=/Residential/Map.aspx&amp;distil_RID=5132BCAC-58D1-11E7
-A2E9-911702A49067&amp;distil_TID=20170624113609">\n<script type="text/javascript">\n\t(function(window){\n\t\ttry {\n\t
\t\tif (typeof sessionStorage !== \'undefined\'){\n\t\t\t\tsessionStorage.setItem(\'distil_referrer\', document.referrer
);\n\t\t\t}\n\t\t} catch (e){}\n\t})(window);\n</script>\n<script type="text/javascript" src="/cndnrlsttdstlssxbaquqxaws
wrdyrzbte.js" defer=""></script><style type="text/css">#d__fFH{position:absolute;top:-5000px;left:-5000px}#d__fF{font-fa
mily:serif;font-size:200px;visibility:hidden}#rvwebwba{display:none!important}</style></head>\n<body>\n<div id="distil_i
dent_block">&nbsp;</div>\n\n\n</body></html>'}
2017-06-24 17:06:22 [scrapy.core.scraper] DEBUG: Scraped from <200 https://www.realtor.ca/Residential/Map.aspx#CultureId
=1&ApplicationId=1&RecordsPerPage=9&MaximumResults=9&PropertySearchTypeId=1&TransactionTypeId=2&StoreyRange=0-0&BedRange
=0-0&BathRange=0-0&LongitudeMin=-80.62866210937504&LongitudeMax=-78.14300537109379&LatitudeMin=43.20517581723729&Latitud
eMax=44.1309708567274&SortOrder=A&SortBy=1&viewState=g&Longitude=-79.3858337402344&Latitude=43.6698583295497&ZoomLevel=1
0&PropertyTypeGroupID=1&CurrentPage=10>

I apologise if this is a silly question, but I am new to this and have only done simple, static scraping before.

Please let me know what I am doing wrong.

Any help is greatly appreciated, thanks in advance.


1 Answer


So, as requested, here is an example of scraping the page that actually contains the information, rather than the page that merely calls another page.

First I checked the Network tab (right-click the page, Inspect, then the Network tab) and found the page the data is actually fetched from.

As the inspection shows, it is called via a POST request with a bunch of form data. I simply copied the data from the original request into the example code (the formdata variable).

Below is how I got a response for this page through the shell. You should be able to implement something similar in your spider.

>>> formdata = {'LongitudeMin': '-84.08248901367192', 'PropertySearchTypeId': '1', 'LongitudeMax': '-74.68917846679692', 'Latitude': '43.6698583295497', 'GUID': '1e5bb0d7-a12c-4eb6-b37a-b9938c87b6bc', 'TransactionTypeId': '2', 'CultureId': '1', 'BedRange': '0-0', 'MaximumResults': '9', 'Longitude': '-79.3858337402344', 'ZoomLevel': '10', 'Token': 'D6TmfZprLI/lh7oVUS6IVlY0LerZeFUfyVGNICdRgXg=', 'BathRange': '0-0', 'StoreyRange': '0-0', 'SortOrder': 'A', 'RecordsPerPage': '9', 'PropertyTypeGroupID': '1', 'ApplicationId': '1', 'LatitudeMax': '44.629573191950996', 'SortBy': '1', 'LatitudeMin': '42.69454866207688', 'viewState': 'g'}
>>> from scrapy.http import FormRequest
>>> req = FormRequest(url="https://api2.realtor.ca/Listing.svc/PropertySearch_Post", formdata=formdata)
>>> fetch(req)

This produces a page containing JSON. Loading this data lets you access the results as a list:

>>> import json
>>> results = json.loads(response.body_as_unicode())['Results']
>>> print results[0]
{u'Building': {u'Ammenities': u'Storage - Locker, Exercise Centre, Party Room', u'Bedrooms': u'2', u'Type': u'Apartment', u'BathroomTotal': u'2'}, u'AlternateURL': {u'VideoLink': u'http://www.myvisuallistings.com/pfsnb/237437'}, u'PublicRemarks': u"Rarely Found Unit With 9 Ft Ceiling & 2 Parking Spots. **S. Steel Appliances**Extended Long Kitchen Cabinets- Lots Of Storage** Gleaming Laminate Flooring & Wide B. Boards** Custom Closet Organizers**Hydro Included In Maintenance**Freshly Painted**Relaxing View Of Water Fountain**State Of The Art Amenities, Theater, 5-Pin Bowling, Virtual Golf, Indoor Pool, Gym, Guest Suites & Bbq. .,24 Hrs Concierge Service & Alarm System..Get It Before It's Gone!! **** EXTRAS **** Stainless Steel Fridge, S.S. Stove, S.S. B/I D/Washer, S.S. Otr Microwave, Washer/Dryer, Designer Ceiling Fans, Crystal Chandelier, Amenities On The Same Floor. Walk To Celebration Square, Library, Square One, Ymca & Living Arts.", u'Business': {}, u'Land': {}, u'StatusId': u'1', u'PhotoChangeDateUTC': u'18/05/2017 2:15:09 PM', u'PostalCode': u'L5B4P5', u'Individual': [{u'CorporationDisplayTypeId': u'0', u'Name': u'MANOJ KUMAR ARORA', u'FirstName': u'MANOJ KUMAR', u'Phones': [{u'AreaCode': u'905', u'PhoneTypeId': u'1', u'PhoneNumber': u'488-3101', u'PhoneType': u'Telephone'}, {u'AreaCode': u'416', u'PhoneTypeId': u'5', u'PhoneNumber': u'618-9753', u'PhoneType': u'Toll Free'}], u'PermitShowListingLink': True, u'LastName': u'ARORA', u'Websites': [{u'Website': u'http://www.AceTeamRealty.com', u'WebsiteTypeId': u'1'}], u'PermitFreetextEmail': True, u'IndividualID': 1501789, u'Position': u'Broker of record', u'Organization': {u'Name': u'ACE TEAM REALTY INC.', u'Designation': u'Brokerage', u'OrganizationID': 220890, u'Phones': [{u'AreaCode': u'905', u'PhoneTypeId': u'1', u'PhoneNumber': u'488-3101', u'PhoneType': u'Telephone'}, {u'AreaCode': u'888', u'PhoneTypeId': u'4', u'PhoneNumber': u'443-3155', u'PhoneType': u'Fax'}], u'PermitShowListingLink': True, u'Websites': [{u'Website': u'http://www.aceteamrealty.com', u'WebsiteTypeId': u'1'}], u'PermitFreetextEmail': True, u'Address': {u'AddressText': u'77 CITY CENTRE DR #501|EAST TOWER|MISSISSAUGA, ON L5B1M5'}, u'HasEmail': True, u'Emails': [{u'ContactId': u'392248908'}]}, u'Emails': [{u'ContactId': u'397620464'}]}], u'RelativeDetailsURL': u'/Residential/Single-Family/18187170/326---3888-DUKE-OF-YORK-Boulevard-Mississauga-Ontario-L5B4P5-City-Centre', u'MlsNumber': u'W3797140', u'Property': {u'TypeId': u'300', u'Photo': [{u'HighResPath': u'https://cdn.realtor.ca/listing/TS636306993099630000/reb82/highres/0/w3797140_1.jpg', u'SequenceId': u'1', u'LowResPath': u'https://cdn.realtor.ca/listing/TS636306993099630000/reb82/lowres/0/w3797140_1.jpg', u'LastUpdated': u'18/05/2017 10:15:09 AM', u'MedResPath': u'https://cdn.realtor.ca/listing/TS636306993099630000/reb82/medres/0/w3797140_1.jpg'}], u'Price': u'$1', u'Address': {u'Latitude': u'43.58744', u'Longitude': u'-79.64139', u'AddressText': u'326 - 3888 DUKE OF YORK Boulevard |Mississauga, Ontario L5B4P5'}, u'Parking': [{u'Name': u'Underground'}], u'ParkingSpaceTotal': u'2', u'AmmenitiesNearBy': u'Hospital, Park, Public Transit', u'Type': u'Single Family', u'OwnershipType': u'Condominium/Strata'}, u'Id': u'18187170'}
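
For example, you can pull individual fields out of each result using the key names visible in the dump above (just a quick sketch; adjust to whatever fields you actually need):

>>> for r in results:
...     print r['MlsNumber'], r['Property']['Price'], r['Property']['Address']['AddressText']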

You can use these methods to get your data. It is much faster than scraping through Splash.
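
If it helps, here is a rough, untested sketch of how this could be wired into a spider. The coordinates are copied from your start URL, the GUID and Token fields from my formdata above are omitted here (they may be session-specific and need to be captured fresh from your own Network tab), and I am only assuming the API accepts a CurrentPage field for pagination the same way the map URL does; the spider name and output field names are my own.

import json

import scrapy
from scrapy.http import FormRequest

# Base form data for the API call. Coordinates come from the question's start
# URL; the GUID/Token fields seen in the browser request are left out and may
# need to be added from your own session.
BASE_FORMDATA = {
    'CultureId': '1', 'ApplicationId': '1',
    'RecordsPerPage': '9', 'MaximumResults': '9',
    'PropertySearchTypeId': '1', 'TransactionTypeId': '2',
    'StoreyRange': '0-0', 'BedRange': '0-0', 'BathRange': '0-0',
    'LongitudeMin': '-80.62866210937504', 'LongitudeMax': '-78.14300537109379',
    'LatitudeMin': '43.20517581723729', 'LatitudeMax': '44.1309708567274',
    'SortOrder': 'A', 'SortBy': '1', 'viewState': 'g',
    'Longitude': '-79.3858337402344', 'Latitude': '43.6698583295497',
    'ZoomLevel': '10', 'PropertyTypeGroupID': '1',
}


class ApiListSpider(scrapy.Spider):
    name = 'api_list'
    custom_settings = {'AUTOTHROTTLE_ENABLED': True}

    def start_requests(self):
        # Assumption: the API honours a CurrentPage field the same way the map URL does.
        for page in range(1, 51):
            formdata = dict(BASE_FORMDATA, CurrentPage=str(page))
            yield FormRequest(
                url='https://api2.realtor.ca/Listing.svc/PropertySearch_Post',
                formdata=formdata,
                callback=self.parse,
            )

    def parse(self, response):
        # The API returns JSON; yield whichever fields you actually need.
        for result in json.loads(response.text)['Results']:
            yield {
                'mls_number': result.get('MlsNumber'),
                'price': result.get('Property', {}).get('Price'),
                'address': result.get('Property', {}).get('Address', {}).get('AddressText'),
            }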

Answered 2017-06-25T11:12:06.900