4

I need to update the solr index on a schedule with the command:

(env)$ ./manage.py update_index

I've looked through the Celery docs and found info on scheduling, but haven't been able to find a way to run a django management command on a schedule and inside a virtualenv. Would this be better run on a normal cron? And if so how would I run it inside the virtualenv? Anyone have experience with this?

Thanks for the help!


3 Answers

11

Django Celery task scheduling. Project structure:

[appname]/
├── [appname]/
│   ├── __init__.py
│   ├── settings.py
│   ├── urls.py
│   ├── celery.py
│   └── wsgi.py
├── [project1]/
│   ├── __init__.py
│   └── tasks.py
└── manage.py

Add the following configuration to settings.py:

STATIC_URL = '/static/'

# Celery configuration
from celery.schedules import crontab  # only needed if you use crontab() schedules

BROKER_URL = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'redis'
CELERY_TIMEZONE = 'UTC'

celery.py — holds the Celery app and the beat schedule:

from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'appname.settings')

app = Celery('appname')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))


# Scheduler: run project1.tasks.cleanup every 30 seconds
app.conf.beat_schedule = {
    'add-every-30-seconds': {
        'task': 'project1.tasks.cleanup',
        'schedule': 30.0,
        'args': (),
    },
}
app.conf.timezone = 'UTC'
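If you want the task to run at a fixed time of day rather than every 30 seconds, the same `beat_schedule` accepts a `crontab` entry. A minimal sketch (the task path `project1.tasks.cleanup` comes from the answer above; the 03:30 time and the `cleanup-daily` key are just examples):

```python
from celery import Celery
from celery.schedules import crontab

app = Celery('appname')

app.conf.beat_schedule = {
    'cleanup-daily': {
        'task': 'project1.tasks.cleanup',
        'schedule': crontab(minute=30, hour=3),  # every day at 03:30 (app timezone)
        'args': (),
    },
}
app.conf.timezone = 'UTC'
```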

__init__.py (in the appname/ package):

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']

tasks.py from project1:

from celery import shared_task
from django.core import management


@shared_task
def cleanup():
    """Clean up expired sessions by using a Django management command."""
    try:
        print("in celery module")
        management.call_command("clearsessions", verbosity=0)
        # PUT YOUR MANAGEMENT COMMAND HERE
        return "success"
    except Exception as e:
        print(e)
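For the question's actual command, the same tasks.py pattern applies: `call_command` runs a management command in-process, so there is no virtualenv to activate — the worker is already running inside it. A hedged sketch, assuming the `update_index` command (from django-haystack) is installed; the task name `update_search_index` is made up here:

```python
from celery import shared_task
from django.core import management


@shared_task
def update_search_index():
    # Equivalent to `./manage.py update_index`, run inside the worker
    # process; the worker's interpreter already lives in the virtualenv.
    management.call_command("update_index")
```

Point the `beat_schedule` entry's `'task'` key at this task's dotted path to run it on a schedule.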

The task will run every 30 seconds.

Requirements on Windows:

  1. The redis server should be running.
  2. The celery worker and celery beat should be started in separate terminals with the following commands:

    celery -A appname worker -l info

    celery -A appname beat -l info

Requirements on Linux:

  1. The redis server should be running.
  2. The celery worker and celery beat should be running; on Linux, beat can be embedded in the worker, so both start with a single command:

    celery -A appname worker -l info -B

@tzenderman please let me know if I missed something. For me this is working fine.

Answered 2018-07-19T18:28:51.037
1

To run the command periodically from a cron job, just wrap the command in a bash script that loads the virtualenv. For example, here is how we run manage.py commands:

django_cmd.sh:

#!/bin/bash

cd /var/www/website/
source venv/bin/activate
/var/www/website/manage.py $1 --settings=$2

crontab:

MAILTO=webmaster@website.com
SETTINGSMODULE=website.settings_prod
5 * * * * /var/www/website/django_cmd.sh update_index $SETTINGSMODULE >> /dev/null
0 10 * * * /var/www/website/django_cmd.sh update_accounts $SETTINGSMODULE 
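Activating the virtualenv in a wrapper script is equivalent to invoking the virtualenv's interpreter by absolute path, so the cron entry can also call e.g. `/var/www/website/venv/bin/python /var/www/website/manage.py update_index` directly (paths illustrative). A small demonstration of the principle, with `sys.executable` standing in for `venv/bin/python`:

```python
import subprocess
import sys

# An interpreter invoked by absolute path uses its own site-packages;
# no `source bin/activate` step is required. sys.executable stands in
# here for a virtualenv path like /var/www/website/venv/bin/python.
cmd = [sys.executable, "-c", "print('update_index would run here')"]
result = subprocess.run(cmd, capture_output=True, text=True, check=True)
print(result.stdout.strip())
```

Activation mainly just prepends the venv's `bin/` to `PATH`; calling the binary directly skips that step, which is convenient in cron's minimal environment.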
Answered 2013-07-16T22:42:14.253
-1

Actually, I found a nice way to do this using Fabric + Celery, and I'm working on it now:

In app/tasks.py, create a fabric function with the manage.py commands you need, then decorate it with @periodic_task, add it to your celery schedule, and that should do it.

UPDATE: I wasn't able to actually use Fabric + Celery, because using fabric in the module caused it to be recognized as a fabfile, and the celery calls in the file didn't work.

Answered 2013-07-17T15:48:04.063