
How to run Scrapy inside a thread with Python


This article shows how to run a Scrapy crawler inside a thread from Python, shared here for your reference. The details are as follows.

If you want to invoke Scrapy from an existing program, the code below lets Scrapy run in its own thread.

"""
Code to run Scrapy crawler in a thread - works on Scrapy 0.8
"""
import threading,Queue
from twisted.internet import reactor
from scrapy.xlib.pydispatch import dispatcher
from scrapy.core.manager import scrapymanager
from scrapy.core.engine import scrapyengine
from scrapy.core import signals
class CrawlerThread(threading.Thread):
  def __init__(self):
    threading.Thread.__init__(self)
    self.running = False
  def run(self):
    # Start Scrapy and the Twisted reactor in this thread; signal handlers
    # must not be installed outside the main thread
    self.running = True
    scrapymanager.configure(control_reactor=False)
    scrapymanager.start()
    reactor.run(installSignalHandlers=False)
  def crawl(self, *args):
    if not self.running:
      raise RuntimeError("CrawlerThread not running")
    # Block the calling thread until the spider_closed signal fires
    self._call_and_block_until_signal(signals.spider_closed,
                                      scrapymanager.crawl, *args)
  def stop(self):
    # Safe to call from any thread: stop the engine on the reactor thread
    reactor.callFromThread(scrapyengine.stop)
  def _call_and_block_until_signal(self, signal, f, *a, **kw):
    q = Queue.Queue()
    def unblock():
      q.put(None)
    dispatcher.connect(unblock, signal=signal)
    # Run f on the reactor thread, then block here until the signal arrives
    reactor.callFromThread(f, *a, **kw)
    q.get()
# Usage example below:
 
import os
os.environ.setdefault('SCRAPY_SETTINGS_MODULE','myproject.settings')
from scrapy.xlib.pydispatch import dispatcher
from scrapy.core import signals
from scrapy.conf import settings
from scrapy.crawler import CrawlerThread
settings.overrides['LOG_ENABLED'] = False # avoid log noise
def item_passed(item):
  print "Just scraped item:",item
dispatcher.connect(item_passed,signal=signals.item_passed)
crawler = CrawlerThread()
print "Starting crawler thread..."
crawler.start()
print "Crawling somedomain.com...."
crawler.crawl('somedomain.com') # blocking call
print "Crawling anotherdomain.com..."
crawler.crawl('anotherdomain.com') # blocking call
print "Stopping crawler thread..."
crawler.stop()
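
Note that the code above targets Scrapy 0.8 and Python 2; the internal modules it imports (scrapy.core.manager, scrapy.xlib.pydispatch, scrapy.conf) no longer exist in current Scrapy releases. Purely as a rough sketch of the same idea on modern Scrapy (1.x and later), the Twisted reactor can be kept in a background daemon thread and crawls scheduled onto it with CrawlerRunner and reactor.callFromThread. The ExampleSpider class, the example.com URL, and the crawl_blocking() helper below are placeholders for illustration, not part of the original example.

# Sketch only: running Scrapy in a background thread on a modern release
import threading

import scrapy
from scrapy.crawler import CrawlerRunner
from twisted.internet import reactor


class ExampleSpider(scrapy.Spider):
    # Placeholder spider; swap in a spider from your own project
    name = "example"
    start_urls = ["https://example.com"]

    def parse(self, response):
        yield {"title": response.css("title::text").get()}


def crawl_blocking(runner, spider_cls):
    # Mirror the old CrawlerThread.crawl(): schedule the crawl on the
    # reactor thread and block the caller until it finishes
    done = threading.Event()

    def schedule():
        d = runner.crawl(spider_cls)
        d.addBoth(lambda _: done.set())

    reactor.callFromThread(schedule)
    done.wait()


# The reactor runs in a daemon thread; signal handlers stay uninstalled
# because we are not on the main thread
threading.Thread(
    target=reactor.run, kwargs={"installSignalHandlers": False}, daemon=True
).start()

runner = CrawlerRunner()
crawl_blocking(runner, ExampleSpider)  # blocking call, like crawler.crawl()
reactor.callFromThread(reactor.stop)   # like crawler.stop()

The Event-based wait plays the same role as the Queue in _call_and_block_until_signal(): the calling thread blocks until the crawl's Deferred fires.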

Hopefully this article is helpful to readers working on their own Python programs.
