Simplifying Asynchronous Task Execution with Celery in Python
Setting up the Celery Application
First, we need to set up our Celery application. This involves specifying the message broker and defining tasks. A message broker is a mechanism responsible for transferring data between the application and Celery workers. In our example, we use RabbitMQ as the broker.
Here is the code snippet for setting up a Celery application, saved in a file named tasks.py:
```python
from celery import Celery

# Create a Celery instance
app = Celery("tasks", broker='amqp://username:password@localhost')

# Define a simple task to add two numbers
@app.task
def add(x, y):
    return x + y
```
In this setup, Celery is initialized with a name (“tasks”) and a broker URL, which includes the username, password, and server location (in this case, localhost for local development).
Defining a Task
We define a simple task using the @app.task decorator. This task, add, takes two parameters, x and y, and returns their sum. The decorator marks this function as a task that Celery can manage.
Calling the Task Asynchronously
To call our add task asynchronously, we use the following code snippet in main.py:
```python
from tasks import add

# Call the add task asynchronously
result = add.delay(1, 2)
print("Task sent to the Celery worker!")
```
The delay method is a convenient shortcut provided by Celery for executing a task asynchronously. When add.delay(1, 2) is called, Celery publishes the task to the queue, where it is picked up by an available worker.
Running Celery Workers
To execute the tasks in the queue, we need to run Celery workers. Assuming you’ve activated a virtual environment, you can start a Celery worker using the following command:
```shell
.venv\Scripts\celery.exe -A tasks worker --loglevel=info
```
This command starts a Celery worker with a log level of info, which provides a moderate amount of logging output. Here, -A tasks tells Celery that our application is defined in the tasks.py file. Note that the command above uses the Windows path to the virtual environment's executable; on macOS or Linux the equivalent is simply celery -A tasks worker --loglevel=info.