Using Celery, Redis and Django together to improve the efficiency of asynchronous task processing
Introduction: When developing web applications, we often encounter time-consuming tasks. If these tasks are executed directly in the request/response cycle, the user has to wait too long, which makes for a poor experience. To solve this problem, we can combine Celery, Redis and Django to process time-consuming tasks asynchronously, improving both system performance and user experience.
Celery introduction and installation
Celery is a distributed task queue based on message passing, with built-in support for task scheduling. It can be installed with pip, together with the Redis client library it needs to talk to the broker:

```shell
pip install celery redis
```
Django configuration
First you need to add Celery configuration items in the settings.py file of the Django project, as shown below:
```python
# settings.py

# Celery configuration
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```
In the configuration above, CELERY_BROKER_URL and CELERY_RESULT_BACKEND point to the address and port of Redis, which serves both as the message broker for the task queue and as the backend for result storage.
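The article omits one step: for the worker command later on to find the application, a Django project conventionally also needs a celery.py module next to settings.py. A minimal sketch following Celery's standard Django integration pattern (the module path assumes a project named your_project_name):

```python
# your_project_name/celery.py
import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'your_project_name.settings')

app = Celery('your_project_name')

# Load every setting prefixed with CELERY_ from settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Automatically discover tasks.py modules in installed apps.
app.autodiscover_tasks()
```

With namespace='CELERY', the CELERY_ prefix is stripped, so CELERY_BROKER_URL in settings.py becomes Celery's broker_url option.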
Next, register a URL route for the asynchronous view in the project's urls.py file, as follows:
```python
# urls.py
from django.urls import path

from .views import AsyncTaskView

urlpatterns = [
    path('async-task/', AsyncTaskView.as_view(), name='async_task'),
]
```
Create task function
Create the tasks.py file in the Django app and define the asynchronous task function in it. Here is a sample code:
```python
# app/tasks.py
import time

from celery import shared_task

@shared_task
def process_task():
    # Simulate a long-running job (wait 5 seconds)
    time.sleep(5)
    return 'Task completed'
```
In the code above, the @shared_task decorator turns the function into a Celery task that any configured Celery app can execute.
View implementation
Define a view class in the Django app's views.py file to receive requests and invoke the asynchronous task function. Here is a sample:

```python
# app/views.py
from django.http import HttpResponse
from django.views import View

from .tasks import process_task

class AsyncTaskView(View):
    def get(self, request):
        # Enqueue the asynchronous task; delay() returns immediately
        task = process_task.delay()
        return HttpResponse('Task started')
Start the Celery service
Use the following command to start the Celery worker process:
```shell
celery -A your_project_name worker --loglevel=info
```
Note that your_project_name must be replaced with the name of your Django project.
Test the asynchronous task
Visit http://localhost:8000/async-task/ in the browser. If everything is working, you will see the response 'Task started'. Meanwhile, the task is being processed asynchronously in the background and does not block the handling of the request.

Conclusion: By combining Celery, Redis and Django, we can process time-consuming tasks asynchronously and improve system performance and user experience. Celery makes task queues and task scheduling easy to manage, while Redis as the backend provides reliable storage for task data. This approach is widely applicable in web application development, and its implementation has been demonstrated here with concrete code examples.