Getting output from a Python subprocess job into Tornado
I have searched a lot, but I have not found how to get the output of a running
Python subprocess into Tornado. What I want is something like
Travis CI. On an admin page I start a job; the server receives the request and launches a subprocess. This subprocess does some data mining and fills a string buffer with log lines. I will fetch that log with some AJAX (via setTimeout or a WebSocket) and print it to the page. Even if the user closes the page and returns to it later, the log should still be there and keep updating. Really, very similar to Travis.
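The "close the page and come back later" requirement boils down to keeping the log server-side, per job, and letting each client poll for everything after the last offset it saw. A minimal, framework-free sketch of such a buffer (the JobLog name and its API are my own invention, not part of Tornado):

```python
import threading


class JobLog:
    '''Append-only, thread-safe log buffer for one background job.
    (Hypothetical helper for illustration, not a Tornado class.)'''

    def __init__(self):
        self._lines = []
        self._lock = threading.Lock()

    def append(self, line):
        '''Called by whatever reads the subprocess output.'''
        with self._lock:
            self._lines.append(line)

    def lines_since(self, offset):
        '''Return (new_lines, new_offset): everything appended after the
        offset a client last saw, so a returning page can catch up on the
        full history and then keep polling for just the new lines.'''
        with self._lock:
            return self._lines[offset:], len(self._lines)


if __name__ == '__main__':
    log = JobLog()
    log.append('cloning repo')
    log.append('running tests')
    lines, offset = log.lines_since(0)   # first poll: full history
    print(lines, offset)
    log.append('done')
    print(log.lines_since(offset))       # next poll: only the new line
```

An AJAX handler would simply call lines_since with the offset the client sends back; a WebSocket handler could instead push each appended line as it arrives.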
Solution
This blog post shows one way to do it:
http://stefaanlippens.net/python-asynchronous-subprocess-pipe-reading
Essentially, the post shows how to read a process's stdout and stderr asynchronously so that reading its output cannot deadlock. You can replace the produce command in __main__ to run whatever command you like, and replace the print statements with code that handles the output in Tornado. Update: I have included the code below in case the blog post ever disappears:
```python
import sys
import subprocess
import random
import time
import threading
import queue


class AsynchronousFileReader(threading.Thread):
    '''
    Helper class to implement asynchronous reading of a file
    in a separate thread. Pushes read lines on a queue to
    be consumed in another thread.
    '''

    def __init__(self, fd, line_queue):
        assert isinstance(line_queue, queue.Queue)
        assert callable(fd.readline)
        threading.Thread.__init__(self)
        self._fd = fd
        self._queue = line_queue

    def run(self):
        '''The body of the thread: read lines and put them on the queue.'''
        for line in iter(self._fd.readline, ''):
            self._queue.put(line)

    def eof(self):
        '''Check whether there is no more content to expect.'''
        return not self.is_alive() and self._queue.empty()


def consume(command):
    '''
    Example of how to consume standard output and standard error of
    a subprocess asynchronously without risk of deadlocking.
    '''
    # Launch the command as a subprocess (text mode, so readline returns str).
    process = subprocess.Popen(command, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE, text=True)

    # Launch the asynchronous readers of the process' stdout and stderr.
    stdout_queue = queue.Queue()
    stdout_reader = AsynchronousFileReader(process.stdout, stdout_queue)
    stdout_reader.start()
    stderr_queue = queue.Queue()
    stderr_reader = AsynchronousFileReader(process.stderr, stderr_queue)
    stderr_reader.start()

    # Check the queues for output (until there is nothing more to get).
    while not stdout_reader.eof() or not stderr_reader.eof():
        # Show what we received from standard output.
        while not stdout_queue.empty():
            line = stdout_queue.get()
            print('Received line on standard output: ' + repr(line))

        # Show what we received from standard error.
        while not stderr_queue.empty():
            line = stderr_queue.get()
            print('Received line on standard error: ' + repr(line))

        # Sleep a bit before asking the readers again.
        time.sleep(.1)

    # Let's be tidy and join the threads we've started.
    stdout_reader.join()
    stderr_reader.join()

    # Close the subprocess' file descriptors.
    process.stdout.close()
    process.stderr.close()


def produce(items=10):
    '''
    Dummy function that randomly writes a couple of lines to
    standard output and standard error.
    '''
    for i in range(items):
        output = random.choice([sys.stdout, sys.stderr])
        output.write('Line %d on %s\n' % (i, output))
        output.flush()
        time.sleep(random.uniform(.1, 1))


if __name__ == '__main__':
    # The main flow:
    # if there is a command line argument 'produce', act as a producer;
    # otherwise be a consumer (which launches a producer as a subprocess).
    if len(sys.argv) == 2 and sys.argv[1] == 'produce':
        produce(10)
    else:
        consume([sys.executable, sys.argv[0], 'produce'])
```
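The thread-and-queue approach above predates Tornado's coroutine support. Since Tornado 5 the IOLoop runs on top of asyncio, so the same non-blocking read loop can be sketched with the standard library alone and driven from any coroutine handler. The stream_command helper and its on_line callback below are my own names for illustration, not Tornado or asyncio API:

```python
import asyncio
import sys


async def stream_command(argv, on_line):
    '''Run argv as a subprocess and invoke on_line for each line of output
    as it arrives, without blocking the event loop. Returns the exit code.'''
    process = await asyncio.create_subprocess_exec(
        *argv,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,  # merge stderr into stdout
    )
    # readline() yields b'' at EOF, which ends the loop.
    while True:
        line = await process.stdout.readline()
        if not line:
            break
        on_line(line.decode(errors='replace').rstrip('\n'))
    return await process.wait()


if __name__ == '__main__':
    collected = []  # in a real app this buffer would outlive the request
    exit_code = asyncio.run(stream_command(
        [sys.executable, '-c', 'print("hello"); print("world")'],
        collected.append))
    print(collected, 'exit code:', exit_code)
```

In a Tornado handler you would await stream_command inside a coroutine and have on_line append to a per-job buffer or push over a WebSocket, instead of collecting into a local list.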