Sharing a variable between workers with Python multiprocessing
Published: 2020-12-20 13:43:13  Category: Python  Source: web
See also this English answer:
Python multiprocessing and a shared counter (3 answers)
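The linked answer deals with a shared counter rather than a shared list. A minimal sketch of that approach, using a multiprocessing.Value passed to each worker through the Pool initializer (the function names here are illustrative, not from the linked answer):

```python
import multiprocessing

def init(c):
    # Make the shared counter visible inside each worker process.
    global counter
    counter = c

def work(_):
    # get_lock() serializes the read-modify-write across processes.
    with counter.get_lock():
        counter.value += 1

def run():
    counter = multiprocessing.Value('i', 0)  # shared int, starts at 0
    with multiprocessing.Pool(2, init, (counter,)) as pool:
        pool.map(work, range(10))
    return counter.value

if __name__ == '__main__':
    print(run())  # 10
```

Without the `get_lock()` block, `counter.value += 1` is a non-atomic read-then-write and increments can be lost under contention.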
How do I read and update a variable shared between multiple worker processes in Python? For example, I am scanning a list of files with multiple processes and want to check whether a file's parent directory has already been scanned:

    import multiprocessing
    import os

    def readFile(filename):
        """ Add the parent folder to the database and process the file """
        path_parts = os.path.split(filename)
        dirname = os.path.basename(path_parts[0])
        if dirname not in shared_variable:  # how do I share this variable?
            # Insert into the database
            pass
        # Other file functions

    def main():
        """ Walk through files and pass each file to readFile() """
        queue = multiprocessing.Queue()
        # init and PATH are defined elsewhere in the asker's code
        pool = multiprocessing.Pool(None, init, [queue])
        for dirpath, dirnames, filenames in os.walk(PATH):
            full_path_fnames = map(lambda fn: os.path.join(dirpath, fn), filenames)
            pool.map(readFile, full_path_fnames)
Solution
You can use multiprocessing.Manager to help with this. It lets you create a list that can be shared between processes:
    from functools import partial
    import multiprocessing
    import os

    def readFile(shared_variable, filename):
        """ Add the parent folder to the database and process the file """
        path_parts = os.path.split(filename)
        dirname = os.path.basename(path_parts[0])
        if dirname not in shared_variable:
            # Insert into the database
            pass
        # Other file functions

    def main():
        """ Walk through files and pass each file to readFile() """
        manager = multiprocessing.Manager()
        shared_variable = manager.list()
        queue = multiprocessing.Queue()
        pool = multiprocessing.Pool(None, init, [queue])
        func = partial(readFile, shared_variable)
        for dirpath, dirnames, filenames in os.walk(PATH):
            full_path_fnames = map(lambda fn: os.path.join(dirpath, fn), filenames)
            pool.map(func, full_path_fnames)
partial is used only to make it easier to pass shared_variable into each call of readFile, alongside each member of full_path_fnames supplied through map. (Editor: Li Datong)
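Note that `if dirname not in shared_variable` followed by an insert is not atomic, so two workers can race on the same directory. A self-contained sketch of the Manager-list pattern that closes this gap with a manager Lock (the file names, record_dir, and run here are made up for illustration):

```python
from functools import partial
import multiprocessing
import os

def record_dir(shared_dirs, lock, filename):
    # Same idea as readFile() above: derive the parent directory name.
    dirname = os.path.basename(os.path.dirname(filename))
    # The membership test plus append is not atomic, so guard both with
    # a manager Lock; otherwise two workers could insert the same name.
    with lock:
        if dirname not in shared_dirs:
            shared_dirs.append(dirname)

def run():
    with multiprocessing.Manager() as manager:
        shared_dirs = manager.list()
        lock = manager.Lock()
        # partial binds the proxies so pool.map only supplies filenames.
        func = partial(record_dir, shared_dirs, lock)
        files = ['a/1.txt', 'a/2.txt', 'b/3.txt']  # hypothetical paths
        with multiprocessing.Pool(2) as pool:
            pool.map(func, files)
        return sorted(shared_dirs)

if __name__ == '__main__':
    print(run())  # ['a', 'b']
```

Both the list proxy and the lock proxy are picklable, which is why they can be baked into the partial and shipped to worker processes.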
