I used to keep all of my Flask application code and celery code in one file, and it worked fine with supervisor. It was getting unwieldy, though, so I split my tasks out into celery_tasks.py, and that's when this problem appeared.
From my project directory I can start celery manually with
celery -A celery_tasks worker --loglevel=INFO
However, since this is a server, I need celery running as a background daemon. When I run sudo supervisorctl restart celeryd, it fails with the following error:
celeryd: ERROR (abnormal termination)
The log says:
Traceback (most recent call last):
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/bin/celery", line 9, in <module>
load_entry_point('celery==3.0.19', 'console_scripts', 'celery')()
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/__main__.py", line 14, in main
main()
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/bin/celery.py", line 957, in main
cmd.execute_from_commandline(argv)
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/bin/celery.py", line 901, in execute_from_commandline
super(CeleryCommand, self).execute_from_commandline(argv)))
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/bin/base.py", line 185, in execute_from_commandline
argv = self.setup_app_from_commandline(argv)
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/bin/base.py", line 300, in setup_app_from_commandline
self.app = self.find_app(app)
File "/srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/local/lib/python2.7/site-packages/celery/bin/base.py", line 318, in find_app
return sym.celery
AttributeError: 'module' object has no attribute 'celery'
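If I'm reading the traceback right, celery imports the module named by -A and then looks up a module-level celery attribute on it (the return sym.celery line). A quick way to reproduce that lookup by hand, from the same directory and environment supervisor would use, is just:

# Sanity check: mimic what celery's find_app does for "-A celery_tasks".
# Run from the project directory, inside the peerAPIenv virtualenv.
import celery_tasks
print(celery_tasks.celery)  # the attribute the traceback says is missing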
I'm using the following supervisor configuration:
[program:celeryd]
command = celery -A celery_tasks worker --loglevel=INFO
user=peerapi
numprocs=4
stdout_logfile = <path to log>
stderr_logfile = <path to log>
autostart = true
autorestart = true
environment=PATH="<path to my project>"
startsecs=10
; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600
; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true
; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998
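(The environment line only puts my project directory on PATH, while the traceback shows the worker actually being launched through peerAPIenv/bin/celery. For comparison, a variant of the same program section that pins the working directory and calls the virtualenv's celery binary directly would look like the sketch below; the placeholder paths are just that, placeholders.)

[program:celeryd]
; Sketch only: absolute path to the virtualenv's celery plus an explicit
; working directory, so the worker starts the same way as the manual command.
command = /srv/www/learningapi.stanford.edu/peerAPI/peerAPIenv/bin/celery -A celery_tasks worker --loglevel=INFO
directory = <path to my project>
user = peerapi
stdout_logfile = <path to log>
stderr_logfile = <path to log>
autostart = true
autorestart = true
startsecs = 10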
My code also initializes celery correctly:
from celery import Celery
import celeryconfig

celery = Celery('celery_tasks', broker='amqp://guest:guest@localhost:5672//',
                backend='amqp')
celery.config_from_object(celeryconfig)
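The tasks themselves are defined against that same instance; a trimmed example of the shape (the real task names and bodies differ) is:

# In celery_tasks.py, below the Celery() setup shown above.
# Hypothetical task; the real ones follow the same pattern.
@celery.task
def example_task(x, y):
    return x + y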
My celeryconfig.py works fine:
CELERY_TASK_SERIALIZER='json'
CELERY_RESULT_SERIALIZER='json'
CELERY_TIMEZONE='America/Los_Angeles'
CELERY_ENABLE_UTC=True
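For what it's worth, config_from_object also accepts the module name as a string, so the same configuration could be loaded without importing celeryconfig at the top of celery_tasks.py:

# Alternative to the import above: let celery resolve the config module by name.
celery.config_from_object('celeryconfig')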
Any clues?