
I am using Scrapy to crawl a website's external links and store those links in a MySQL db. I am using a snippet in my code. When I run the spider I can see the links being scraped, but it gives an error:

2018-03-07 13:33:27 [scrapy.log] ERROR: not all arguments converted during string formatting

Apparently the links are not being converted to strings because of the dots, slashes, commas and dashes in them. So how can I pass the links and store them without the error? TIA
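
For reference, "not all arguments converted during string formatting" is the generic Python error raised when %-style formatting receives more arguments than there are placeholders; the characters inside the link are not the issue. A minimal reproduction outside Scrapy/MySQLdb (purely illustrative, not from the original post):

# one %s placeholder, but two arguments supplied
try:
    "insert into test (link) values (%s)" % ("http://example.com", "extra")
except TypeError as e:
    print(e)  # not all arguments converted during string formatting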

pipelines.py

from scrapy import log
from twisted.enterprise import adbapi
import MySQLdb.cursors


class MySQLStorePipeline(object):

    def __init__(self):
        self.dbpool = adbapi.ConnectionPool('MySQLdb', db='usalogic_testdb',
                user='root', passwd='1234', cursorclass=MySQLdb.cursors.DictCursor,
                charset='utf8', use_unicode=True)

    def process_item(self, item, spider):
        # run db query in thread pool
        query = self.dbpool.runInteraction(self._conditional_insert, item)
        query.addErrback(self.handle_error)

        return item

    def _conditional_insert(self, tx, item):
        # create the record if it doesn't exist;
        # this whole block runs in its own thread
        tx.execute("select * from test where link = %s", (item['link'], ))
        result = tx.fetchone()
        if result:
            log.msg("Item already stored in db: %s" % item, level=log.DEBUG)
        else:
            tx.execute(\
                "insert into test (link) "
                "values (%s)",
                (item['link'])
            )
            log.msg("Item stored in db: %s" % item, level=log.DEBUG)

    def handle_error(self, e):
        log.err(e)

The error appears when the crawl command is run.

items.py

import scrapy


class CollectUrlItem(scrapy.Item):
    link = scrapy.Field()

settings.py

ITEM_PIPELINES = {
    'rvca4.pipelines.MySQLStorePipeline': 800,
}

1 Answer


I think it will work if you use a list instead of a tuple:

tx.execute(\
        "insert into test (link) "
        "values (%s)",
        [ item['link'] ]
    )

Or add a trailing comma inside the tuple:

tx.execute(\
        "insert into test (link) "
        "values (%s)",
        (item['link'], )
    )

because the trailing comma is what actually makes it a tuple. See below:

(1)  # the number 1 (the parentheses are wrapping the expression `1`)
(1,) # a 1-tuple holding a number 1
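
To see why this matters to the driver, here is a small sketch of my own (assuming MySQLdb's classic %s paramstyle, where the parameters argument is treated as a sequence):

link = "http://example.com/some-page"

# Without the comma this is just a string, i.e. a sequence of characters;
# with the comma it is a 1-tuple holding the whole link.
print(len((link)))    # 28 -- one "parameter" per character
print(len((link,)))   # 1  -- exactly one parameter for the single %s

# Passing the bare string means far more parameters than placeholders,
# which reproduces the error from the question:
try:
    "insert into test (link) values (%s)" % tuple((link))
except TypeError as e:
    print(e)          # not all arguments converted during string formatting
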
answered 2018-03-07T10:59:10.903