Python server programming: task queue using django-celery


As web applications grow in popularity and their user bases expand, modern applications must handle complex, time-consuming tasks without sacrificing responsiveness or stability. From order processing on e-commerce sites and processing of system log files to advanced computer vision and natural language processing workloads, these tasks are best handled by independent worker processes.

The conventional approach is to use cron or a similar job scheduler, but this has the following problems:

  • Tasks are difficult to manage and allocate dynamically.
  • Failed tasks are hard to retry.
  • Tasks cannot easily be distributed across multiple servers.
  • The status of jobs and tasks cannot be tracked or monitored.

So, in order to solve these problems, we need a task queue service.

In the Python ecosystem, Celery is the most commonly used task queue. It is a task queue designed for distributed systems and suitable for high-concurrency, high-throughput web applications.

In this article, we will introduce how to develop a task queue service using Celery and Django. We will use the Django-Celery integration packages, django-celery-results and django-celery-beat, to integrate Celery with Django.

  1. Install related dependencies

First, we need to install the dependencies of Celery and Django-Celery into the project. You can use the pip tool to install them.

pip install celery django-celery-results django-celery-beat
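
Because the configuration below uses Redis as the message broker and result backend, the worker also needs the Redis client library. One way to install it, using the redis "extra" documented by Celery (not anything specific to this project), is:

pip install "celery[redis]"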

  2. Configure Celery

Before we can use Celery, we need to configure it. Create a file called celery.py in your project package, next to settings.py. The contents of the file are as follows:

from __future__ import absolute_import, unicode_literals

import os

from celery import Celery

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'example.settings')

app = Celery('example')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Note: 'example.settings' points Celery at the Django settings module; replace 'example' with the name of your actual Django project, both here and in the Celery('example') call.
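
In the standard Celery-and-Django layout described in the Celery documentation, the project package's __init__.py also imports the app so it is loaded when Django starts and shared_task tasks can find it (the example package name below matches the project name assumed above):

# example/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)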

  3. Configuring Django

Now, we need to configure Django in the settings.py file so that it supports Celery.

# Celery Configuration
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

# Register the Django-Celery apps
INSTALLED_APPS = (
    ...
    'django_celery_results',
    'django_celery_beat',
    ...
)

Here, we configure two key things. (1) CELERY_BROKER_URL and CELERY_RESULT_BACKEND – these tell Celery to use Redis as its message broker and result store. (2) INSTALLED_APPS – we register the two Django-Celery apps, django_celery_results and django_celery_beat, in our project.
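
If a Redis server is not already running locally, one quick way to start one for development (assuming Docker is available; otherwise install Redis through your system's package manager) is:

$ docker run -d -p 6379:6379 redis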

  4. Create tasks

Now that we have configured Celery and Django, we can start defining some tasks. We will create two sample tasks to demonstrate the task structure and syntax. In your app's tasks.py file (core/tasks.py in this example), add the following content.

from django.core.mail import send_mail
from celery import shared_task
from datetime import datetime


@shared_task
def send_email_task():
    subject = 'Celery Email Demo'
    message = 'This email is sent using celery!'
    from_email = 'demo@example.com'
    recipient_list = ['recipient@example.com']
    send_mail(subject, message, from_email, recipient_list)

    print('Email Sent Successfully')
    return None


@shared_task
def print_time():
    now = datetime.now()
    current_time = now.strftime("%H:%M:%S")
    print("Current Time =", current_time)

    return None

Here, we define two tasks: send_email_task and print_time. Note that each task is decorated with the shared_task decorator, which registers it with the Celery app without tying it to a specific app instance, so it can be imported and queued from anywhere in the project.
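
As a quick sketch of how these tasks would typically be queued, here is an illustrative Django view (the view module and function name are examples, not part of the project above):

# core/views.py (illustrative example)
from django.http import JsonResponse

from core.tasks import send_email_task, print_time


def trigger_tasks(request):
    # .delay() pushes the task onto the broker and returns immediately;
    # a running Celery worker picks it up and executes it asynchronously.
    result = send_email_task.delay()
    print_time.delay()
    return JsonResponse({'queued_email_task_id': result.id})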

  5. Start Worker Processes

Now that we have defined the tasks, we need to start the worker processes and tell them what tasks to perform.

Open a terminal window and enter the following command:

$ celery -A example worker --loglevel=info

Note that example is the name of the Django project (the same name we passed to Celery() in celery.py). The --loglevel=info flag controls the worker's log level.
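
For the periodic tasks managed by django_celery_beat (next section), a separate beat scheduler process also needs to run alongside the worker. The database scheduler below is the one documented by django-celery-beat:

$ celery -A example beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler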

  6. Scheduling tasks through the Django admin interface

Django-Celery supports managing and scheduling tasks from the Django admin interface. Once django_celery_results and django_celery_beat are installed, they register their own models (task results, periodic tasks, and schedules) with the admin automatically. To trigger our own tasks directly from the admin, we can add admin actions to the task-result admin in an admin.py file (core/admin.py in this example).

from django.contrib import admin
from django_celery_results.admin import TaskResultAdmin
from django_celery_results.models import TaskResult

from core.tasks import send_email_task, print_time


class TaskResultAdminWithActions(TaskResultAdmin):
    actions = ['run_send_email_task', 'run_print_time']

    def run_send_email_task(self, request, queryset):
        # Queue the email task on the Celery broker.
        send_email_task.delay()

    run_send_email_task.short_description = "Send Email Task"

    def run_print_time(self, request, queryset):
        # Queue the print-time task on the Celery broker.
        print_time.delay()

    run_print_time.short_description = "Print Current Time"


# Replace the default TaskResult admin registered by django_celery_results
# with our subclass that adds the two actions.
admin.site.unregister(TaskResult)
admin.site.register(TaskResult, TaskResultAdminWithActions)

Here, we expose our tasks as admin actions on the task-result list. We can run them by selecting one or more rows and choosing the "Send Email Task" or "Print Current Time" action.
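
Periodic schedules can also be created in code instead of through the admin forms. Below is a minimal sketch using the django_celery_beat models; the 10-second interval and the task path are illustrative choices, not requirements:

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# Run print_time every 10 seconds via the beat scheduler.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)
PeriodicTask.objects.get_or_create(
    name='Print current time every 10 seconds',
    task='core.tasks.print_time',
    interval=schedule,
)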

Now, we have successfully established a task queue service using Django-Celery. Because workers only need access to the shared message broker, we can scale out by running workers on multiple servers that all point at the same Redis instance.

Conclusion

This article introduced how to build a task queue service using Celery and Django. Using the Django-Celery integration packages, we demonstrated how to configure Celery, define tasks, start workers, and schedule and manage tasks from the admin interface. A task queue is an excellent way to offload complex, time-consuming work and improve the performance and reliability of web applications.
