Setting up Celery periodic tasks in Django
First install Redis; it will act as the Celery broker.
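You also need the Python packages. djcelery (django-celery) targets Celery 3.x, so pinning below 4 is safest; the exact pin is an assumption, check the django-celery release notes for your versions:

```shell
# celery<4 is required by django-celery (djcelery); redis is the broker client
pip install "celery<4" django-celery redis
```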
Add the following settings to settings.py:
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_TIMEZONE = 'Asia/Shanghai'
CELERY_ENABLE_UTC = False
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERYD_MAX_TASKS_PER_CHILD = 10
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
Add the app to INSTALLED_APPS:
'djcelery',
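For context, the setting ends up looking like this (the stock Django apps shown are illustrative; keep whatever your project already lists). djcelery also ships its own models, so run your migration command afterwards (migrate on newer Django, syncdb on older releases) to create its tables:

```python
# settings.py (fragment)
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'djcelery',  # provides the database result backend and scheduler
    # ... your own apps ...
]
```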
Create a celery.py next to settings.py (inside the myapp project package):
# -*- coding: utf-8 -*-
from __future__ import absolute_import, unicode_literals
import os

from celery import Celery

# Set the default Django settings module before anything touches the settings.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myapp.settings')

from myapp import settings

app = Celery('myapp',
             broker='redis://localhost:6379')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
# app.config_from_object('django.conf:settings', namespace='CELERY')
app.config_from_object('django.conf:settings')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
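So that this app is loaded whenever Django starts, Celery's documented Django pattern is to import it from the project package's __init__.py:

```python
# myapp/__init__.py
from __future__ import absolute_import, unicode_literals

# Ensure the Celery app is imported when Django starts, so that
# @shared_task decorators bind to it.
from .celery import app as celery_app

__all__ = ['celery_app']
```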
Start everything:
./redis-server &
celery multi start w1 w2 -c 2 --app=myapp --logfile="%n%I.log" --pidfile=/home/app/pid/%n.pid -l info
celery -A myapp beat -l info --logfile="/home/log/myapp/beat.log" --pidfile=/home/app/pid/beat.pid -S djcelery.schedulers.DatabaseScheduler