
Python – Scrapy beginner: error running "scrapy crawl dmoz" from the tutorial

I have set the PATH variable and I believe everything is configured correctly. But when I run "scrapy crawl dmoz" inside the folder created by startproject, I get the following error message:

c:\matt\testing\dmoz>scrapy crawl dmoz
2012-04-24 18:12:56-0400 [scrapy] INFO: Scrapy 0.14.0.2841 started (bot: dmoz)
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled downloader middlewares:
HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, RedirectMiddleware, CookiesMiddleware, HttpCompressionMiddleware, ChunkedTransferMiddleware, DownloaderStats
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled spider middlewares:
HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2012-04-24 18:12:56-0400 [scrapy] DEBUG: Enabled item pipelines:
Traceback (most recent call last):
  File "c:\Python27\Scripts\scrapy", line 4, in <module>
    execute()
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 132, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 97, in _run_print_help
    func(*a, **kw)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\cmdline.py", line 139, in _run_command
    cmd.run(args, opts)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\commands\crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "c:\Python27\lib\site-packages\scrapy-0.14.0.2841-py2.7-win32.egg\scrapy\spidermanager.py", in create
    raise KeyError("Spider not found: %s" % spider_name)
KeyError: 'Spider not found: dmoz'

Does anyone know what might be going on?

Solution

I had this problem too.

It happens because the Scrapy tutorial tells you to put the spider you created in /dmoz/spiders/, but Scrapy is actually looking in tutorial/tutorial/spiders.

Save dmoz_spider.py in tutorial/tutorial/spiders and the crawl should work fine; a minimal sketch of that file follows below.
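For reference, here is a minimal sketch of what dmoz_spider.py could look like, written against the BaseSpider API used by the Scrapy 0.14-era tutorial (the class name, URLs and parse logic below follow that tutorial and are only illustrative; adapt them to your own project). The two details that matter for this error are that the file lives under tutorial/tutorial/spiders/ and that the name attribute is exactly "dmoz", because that is the key "scrapy crawl dmoz" looks up:

# tutorial/tutorial/spiders/dmoz_spider.py
# Minimal spider sketch in the style of the old Scrapy 0.14 tutorial.
from scrapy.spider import BaseSpider

class DmozSpider(BaseSpider):
    # "scrapy crawl dmoz" resolves the spider by this name attribute
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/",
    ]

    def parse(self, response):
        # Save each downloaded page body to a local file named
        # after the second-to-last segment of the URL.
        filename = response.url.split("/")[-2]
        open(filename, "wb").write(response.body)

Run "scrapy crawl dmoz" from the project root (the folder that contains scrapy.cfg). Running "scrapy list" from the same folder is a quick way to confirm that the project can actually see a spider named "dmoz".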
