
Error when running scrapy crawl douban_spider

...ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['douban.spiders']}  (truncated Scrapy settings output)

Traceback (most recent call last):
  File "c:\users\12554\appdata\local\programs\python\python37\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\12554\AppData\Local\Programs\Python\Python37\Scripts\scrapy.exe\__main__.py", line 9, in <module>
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\crawler.py", line 170, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\crawler.py", line 198, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\crawler.py", line 203, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "c:\users\12554\appdata\local\programs\python\python37\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\scrapy\extensions\telnet.py", line 12, in <module>
    from twisted.conch import manhole, telnet
  File "c:\users\12554\appdata\local\programs\python\python37\lib\site-packages\twisted\conch\manhole.py", line 154
    def write(self, data, async=False):
                              ^
SyntaxError: invalid syntax
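
The SyntaxError above comes from Twisted, not from the spider code: "async" became a reserved keyword in Python 3.7, so an older twisted/conch/manhole.py that still uses async as a parameter name can no longer be parsed when Scrapy imports it. A minimal sketch of the conflict (not Twisted's actual source; the Writer class and is_async name are made up for illustration):

    # "async" is a reserved keyword on Python 3.7+, so a parameter with that
    # name is rejected at parse time, before any code runs.
    class Writer:
        # def write(self, data, async=False):   # SyntaxError on Python 3.7+
        #     self.data = data

        # Renaming the parameter lets the file parse again; newer Twisted
        # releases ship an equivalent rename.
        def write(self, data, is_async=False):
            self.data = data

The usual workarounds are upgrading Twisted (for example pip install --upgrade Twisted) or running Scrapy under Python 3.6; both are standard advice for this traceback rather than something confirmed in this thread.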



4 Answers

self

#1

crazy灬小曾 (original poster)

I've already found the error; it was a careless typo of my own. Thanks.
2018-07-21
#2

灰灰grey

Reply to 慕粉2303506572: What did you change? Does it run correctly now?
2018-07-26

How do you fix it, please?


Same question here: where exactly is the error, and how do you fix it?


slef.data
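
This answer seems to be pointing at the kind of slip the asker mentioned: writing "slef" instead of "self" somewhere in the project code. A hypothetical illustration (the DoubanSpider class and parse method below are made up, not the asker's actual code):

    import scrapy

    class DoubanSpider(scrapy.Spider):
        name = "douban_spider"

        def parse(self, response):
            # slef.data = response.text   # NameError: name 'slef' is not defined
            self.data = response.text     # "self" spelled correctly

Note that a typo like this raises a NameError at runtime; it would not by itself produce the Twisted SyntaxError shown in the question.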


