For a Django project that runs celery tasks, I start celery Flower with: python manage.py celery flower --address=0.0.0.0 --port=5555
The server starts up fine, but I get this warning:
[W 161223 17:18:02 control:36] /home/myuser/myenv/myproj/local/lib/python2.7/site-packages/celery/app/control.py:36: DuplicateNodenameWarning: Received multiple replies from node name: 'names'.
Look at the end of that warning: it says my node is named names. I don't understand this. I'm running the celery tasks via supervisor, and none of the nodes is named names.
What is going on here, and how can I diagnose it? Essentially I'm trying to resolve this warning message and make it go away.
Neither /etc/default/celeryd nor the ps output mentions names anywhere.
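Is there a celery command that would list the node names that are actually replying? My understanding (I'm not 100% sure the django-celery manage.py entry point forwards these subcommands the same way the plain celery binary does) is that something like the following should show every node that responds:

python manage.py celery inspect ping
python manage.py celery status

If one of those does report a node called names, then presumably some process somewhere is registering under that name, but I can't see where it would come from.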
Here is supervisord.conf:
; supervisor config file
[unix_http_server]
file=/var/run/supervisor.sock ; (the path to the socket file)
chmod=0700 ; socket file mode (default 0700)
[supervisord]
logfile=/var/log/supervisor/supervisord.log ; (main log file;default $CWD/supervisord.log)
pidfile=/var/run/supervisord.pid ; (supervisord pidfile;default supervisord.pid)
childlogdir=/var/log/supervisor ; ('AUTO' child log dir, default $TEMP)
environment=ON_AZURE="1"
; the below section must remain in the config file for RPC
; (supervisorctl/web interface) to work, additional interfaces may be
; added by defining them in separate rpcinterface: sections
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
[supervisorctl]
serverurl=unix:///var/run/supervisor.sock ; use a unix:// URL for a unix socket
; The [include] section can just contain the "files" setting. This
; setting can list multiple files (separated by whitespace or
; newlines). It can also contain wildcards. The filenames are
; interpreted as relative to this file. Included files *cannot*
; include files themselves.
[include]
files = /etc/supervisor/conf.d/*.conf
And celery.conf is:
[program:celeryworker1]
command=python manage.py celery worker -l info -n celeryworker1
directory = /home/myuser/myproject
environment=PATH="/home/myuser/myenvs/projenv/bin",VIRTUAL_ENV="/home/myuser/myenvs/projenv",PYTHONPATH="/home/myuser/myenvs/projenv/lib/python2.7/site-packages"
user=myuser
password=mypassword
process_name=%(program_name)s%(process_num)d@%(host_node_name)s
numprocs=4
stdout_logfile = /etc/supervisor/logs/celery-worker.log
stderr_logfile = /etc/supervisor/logs/celery-worker.log
autostart = true
autorestart = true
startsecs=10
stopwaitsecs = 600
killasgroup = true
priority = 998
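One thing I'm unsure about: numprocs=4 makes supervisor launch four copies of that same command, and every copy gets the same -n celeryworker1, so (if I understand celery's node naming correctly) all four workers would announce themselves under the same node name. The warning mentions names rather than celeryworker1, so this may be unrelated, but could duplicate node names like this be what triggers the warning? If so, a variant I was considering (untested; I'm assuming supervisord expands %(process_num)d inside command the same way it does in process_name) would give each copy its own node name:

[program:celeryworker1]
command=python manage.py celery worker -l info -n celeryworker%(process_num)d
process_name=%(program_name)s%(process_num)d@%(host_node_name)s
numprocs=4

Is that the right way to think about it, or is the duplicate name coming from somewhere else?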