I'm working through the Django/Celery Quickstart ... or, How I Learned to Stop Using cron and Love celery, and it seems the jobs are getting queued, but never run.

tasks.py:

from celery.task.schedules import crontab
from celery.decorators import periodic_task

# this will run every minute, see http://celeryproject.org/docs/reference/celery.task.schedules.html#celery.task.schedules.crontab
@periodic_task(run_every=crontab(hour="*", minute="*", day_of_week="*"))
def test():
    print "firing test task"

So I run celery:

bash-3.2$ sudo manage.py celeryd -v 2 -B -s celery -E -l INFO  

/scratch/software/python/lib/celery/apps/worker.py:166: RuntimeWarning: Running celeryd with superuser privileges is discouraged!
  'Running celeryd with superuser privileges is discouraged!'))

 -------------- celery@myserver v3.0.12 (Chiastic Slide)
---- **** ----- 
--- * ***  * -- [Configuration]
-- * - **** --- . broker:      django://localhost//
- ** ---------- . app:         default:0x12120290 (djcelery.loaders.DjangoLoader)
- ** ---------- . concurrency: 2 (processes)
- ** ---------- . events:      ON
- ** ---------- 
- *** --- * --- [Queues]
-- ******* ---- . celery:      exchange:celery(direct) binding:celery
--- ***** ----- 

[Tasks]
  . GotPatch.tasks.test

[2012-12-12 11:58:37,118: INFO/Beat] Celerybeat: Starting...
[2012-12-12 11:58:37,163: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 11:58:37,249: WARNING/MainProcess] /scratch/software/python/lib/djcelery/loaders.py:132: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn("Using settings.DEBUG leads to a memory leak, never "
[2012-12-12 11:58:37,348: WARNING/MainProcess] celery@myserver ready.
[2012-12-12 11:58:37,352: INFO/MainProcess] consumer: Connected to django://localhost//.
[2012-12-12 11:58:37,700: INFO/MainProcess] child process calling self.run()
[2012-12-12 11:58:37,857: INFO/MainProcess] child process calling self.run()
[2012-12-12 11:59:00,229: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:00:00,017: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:01:00,020: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)
[2012-12-12 12:02:00,024: INFO/Beat] Scheduler: Sending due task GotPatch.tasks.test (GotPatch.tasks.test)

The tasks are indeed getting queued:

python manage.py shell
>>> from kombu.transport.django.models import Message
>>> Message.objects.count()
234

And the count increases over time:

>>> Message.objects.count()
477

There don't appear to be any lines in the log file indicating that tasks are being executed. I was expecting something like:

[... INFO/MainProcess] Task myapp.tasks.test[39d57f82-fdd2-406a-ad5f-50b0e30a6492] succeeded in 0.00423407554626s: None

Any suggestions on how to diagnose/debug this?


3 Answers


You should check whether you have specified the BROKER_URL parameter in django's settings.py:

BROKER_URL = 'django://'
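With the database transport used in this djcelery 3.0-era setup, BROKER_URL alone is not enough: kombu's Django transport must also be listed as an installed app, since it backs the Message table queried above. A minimal settings.py sketch (adjust the app list to your project):

```python
# settings.py -- minimal djcelery + database-broker configuration (sketch)
import djcelery
djcelery.setup_loader()

BROKER_URL = 'django://'

INSTALLED_APPS = (
    # ... your apps ...
    'djcelery',                # Django/Celery integration (celery 3.x era)
    'kombu.transport.django',  # database-backed broker (the Message table above)
)
```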

You should also check that your timezones in django, mysql and celery are the same. It helped me.
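A sketch of the settings involved (celery 3.x setting names; the timezone string is an example): the point is that Django's TIME_ZONE and CELERY_TIMEZONE should agree, otherwise beat may consider tasks due at a different wall-clock time than you expect.

```python
# settings.py -- keep Django and Celery on the same clock (sketch)
TIME_ZONE = 'America/New_York'   # example -- use whatever your server/database uses
USE_TZ = True

CELERY_TIMEZONE = TIME_ZONE      # celery 3.x name for the beat/worker timezone
CELERY_ENABLE_UTC = True         # store and compare schedule times in UTC
```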

P.S.:

[... INFO/MainProcess] Task myapp.tasks.test[39d57f82-fdd2-406a-ad5f-50b0e30a6492] succeeded in 0.00423407554626s: None

This line means your task has been scheduled (not executed!).

Please check your configuration; I hope it helps.

Answered 2014-11-25T16:00:25.530

I'm new to celery too, but judging from the comments on the link you provided, it looks like there's an error in the tutorial. One of the comments points out that in this command


sudo ./manage.py celeryd -v 2 -B -s celery -E -l INFO

you have to add "-I tasks" in order to load the tasks.py file ...

Have you tried that?
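If you'd rather not pass the flag on every invocation, the equivalent in settings is CELERY_IMPORTS (celery 3.x name); a sketch assuming the project is called GotPatch, as in the worker log above:

```python
# settings.py -- make sure the worker imports the module that defines the task
CELERY_IMPORTS = ('GotPatch.tasks',)
```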

Answered 2013-01-30T08:42:47.150

I hope someone can learn from my experience hacking around with this.

After setting everything up according to the tutorial, I noticed that when I called

add.delay(4,5)

nothing happened. The worker didn't receive the task (nothing was printed on stderr).

The problem was with the rabbitmq installation. It turns out the default required free disk space is 1GB, which was way too much for my VM.

What put me on the right track was reading the rabbitmq log file. To find it, I had to stop and start the rabbitmq server:

sudo rabbitmqctl stop
sudo rabbitmq-server

rabbitmq dumps the log file location to the screen. In the file I noticed this:

=WARNING REPORT==== 14-Mar-2017::13:57:41 ===
disk resource limit alarm set on node rabbit@supporttip.

**********************************************************
*** Publishers will be blocked until this alarm clears ***
**********************************************************

Then I followed the instructions here to reduce the free disk limit: Rabbitmq ignores configuration on Ubuntu 12

As a baseline I used the config file from git: https://github.com/rabbitmq/rabbitmq-server/blob/stable/docs/rabbitmq.config.example

The change itself:

{disk_free_limit, "50MB"}
Answered 2017-03-16T14:56:52.660