python – Celery / Django single task running multiple times
I'm facing an issue where I put a single task into the queue and it runs multiple times.
From the celery logs I can see that the same worker is executing the task:

```
[2014-06-06 15:12:20,731: INFO/MainProcess] Received task: input.tasks.add_queue
[2014-06-06 15:12:20,750: INFO/Worker-2] starting runner..
[2014-06-06 15:12:20,759: INFO/Worker-2] collection started
[2014-06-06 15:13:32,828: INFO/Worker-2] collection complete
[2014-06-06 15:13:32,836: INFO/Worker-2] generation of steps complete
[2014-06-06 15:13:32,836: INFO/Worker-2] update created
[2014-06-06 15:13:33,655: INFO/Worker-2] email sent
[2014-06-06 15:13:33,656: INFO/Worker-2] update created
[2014-06-06 15:13:34,420: INFO/Worker-2] email sent
[2014-06-06 15:13:34,421: INFO/Worker-2] FINISH - Success
```

But when I look at the application's own logs, each step shows 5-6 log lines (??).

I'm using Django 1.6 with RabbitMQ. Tasks are put into the queue by calling `delay` on a function. That function (decorated with the task decorator) then calls a class that does the actual work.

Does anyone know the best way to resolve this?

Edit: code added as requested.

views.py

In my view I send the data to the queue via:

```python
from input.tasks import add_queue_project

add_queue_project.delay(data)
```

tasks.py

```python
from celery.decorators import task

@task()
def add_queue_project(data):
    """ run project """
    logger = logging_setup(app="project")
    logger.info("starting project runner..")
    f = project_runner(data)
    f.main()

class project_runner():
    """ main project runner """

    def __init__(self, data):
        self.data = data
        self.logger = logging_setup(app="project")

    def main(self):
        ...  # Code
```

settings.py

```python
THIRD_PARTY_APPS = (
    'south',          # Database migration helpers
    'crispy_forms',   # Form layouts
    'rest_framework',
    'djcelery',
)

import djcelery
djcelery.setup_loader()

BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672  # default RabbitMQ listening port
BROKER_USER = "test"
BROKER_PASSWORD = "test"
BROKER_VHOST = "test"
CELERY_BACKEND = "amqp"  # tell Celery to report the results back to RabbitMQ
CELERY_RESULT_DBURI = ""
CELERY_IMPORTS = ("input.tasks",)
```

celeryd

The line I run to start celery is:

```
python2.7 manage.py celeryd -l info
```

Thanks,

Solution
I can't give you an exact answer, but there are a few things you should look into:
> djcelery is deprecated, so if you're using a new version of celery there may be some kind of conflict.

Looking at your settings, it appears you're either running an old version of celery, or a new version of celery with an old-style configuration. In either case, make sure you have the latest version and try this configuration instead of what you have:

```python
BROKER_URL = 'amqp://<user>:<password>@localhost:5672/<vhost>'
CELERY_RESULT_BACKEND = 'amqp'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```

You will also have to configure celery differently:

Get rid of the djcelery stuff entirely.

Create proj/celery.py in your django project:

```python
from __future__ import absolute_import

import os

from celery import Celery

from django.conf import settings

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')

app = Celery('proj')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
```

In your proj/__init__.py:

```python
from __future__ import absolute_import

from proj.celery import app as celery_app
```

Then, if your input app is a reusable app and not part of your project, use the @shared_task decorator instead of @task.

Then run celery:

```
celery -A proj worker -l info
```

Hope that helps.
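As a side note on the configuration change: the question's old-style `BROKER_HOST`/`BROKER_PORT`/`BROKER_USER`/`BROKER_PASSWORD`/`BROKER_VHOST` settings and the suggested single `BROKER_URL` string describe the same connection. A quick sketch of the mapping, using the values from the question's settings.py:

```python
# The individual broker settings from the question's settings.py ...
BROKER_USER = "test"
BROKER_PASSWORD = "test"
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672
BROKER_VHOST = "test"

# ... collapse into the single BROKER_URL string that newer Celery expects:
BROKER_URL = "amqp://{0}:{1}@{2}:{3}/{4}".format(
    BROKER_USER, BROKER_PASSWORD, BROKER_HOST, BROKER_PORT, BROKER_VHOST)

print(BROKER_URL)  # amqp://test:test@127.0.0.1:5672/test
```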
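For context on the `app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)` call: it works by attempting to import a `tasks` submodule from every installed app, which is how `input.tasks` gets picked up without a `CELERY_IMPORTS` setting. A rough, hypothetical sketch of that lookup (not Celery's actual implementation; `discover_task_modules` is an invented name, demonstrated here with stdlib packages):

```python
import importlib

def discover_task_modules(installed_apps, submodule="tasks"):
    """Return the apps for which `<app>.<submodule>` imports cleanly,
    roughly mirroring what Celery's autodiscover_tasks does."""
    found = []
    for app in installed_apps:
        try:
            importlib.import_module("{0}.{1}".format(app, submodule))
            found.append(app)
        except ImportError:
            pass  # this app has no such submodule; skip it
    return found

# Demo with stdlib packages: `os.path` exists, `json.path` does not.
print(discover_task_modules(["os", "json"], submodule="path"))  # ['os']
```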