Django Automation 2026-04-10

How to automate Chess Daily Puzzles with Celery

Automation is one of the most powerful tools in a developer's arsenal. In this post, we'll explore how to automate a daily chess puzzle delivery using Celery, Celery Beat, and Redis. We'll base our example on a real task that fetches the daily puzzle from Chess.com and sends it via email.

The Architecture

To run periodic tasks in a Django application, we need three main components working together:

  • Redis (The Broker): Acts as a mailbox where tasks are stored before being processed.
  • Celery Beat (The Scheduler): A scheduler that kicks off tasks at regular intervals. It sends a message to the broker when it's time to run a task.
  • Celery Worker (The Consumer): The actual process that takes tasks from the broker and executes them.
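
To make the division of labor concrete, here is a toy, pure-Python sketch of the broker/scheduler/worker flow. A plain in-memory queue stands in for Redis, and the function names are illustrative only — this is not real Celery:

```python
import queue

# A plain in-memory queue standing in for Redis (the broker).
broker = queue.Queue()

def beat_tick():
    """Scheduler role: when it's time, put a task message on the broker."""
    broker.put({"task": "task_chess_daily_puzzle", "args": []})

def worker_loop():
    """Worker role: drain messages from the broker and 'execute' them."""
    executed = []
    while not broker.empty():
        message = broker.get()
        executed.append(message["task"])  # real Celery would call the task function here
    return executed

beat_tick()
print(worker_loop())  # ['task_chess_daily_puzzle']
```

The real system works the same way, except the broker is a Redis server, Beat runs in its own process, and workers can live on entirely different machines.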
Mastering Celery Beat

Setup & Configuration

First, we need to configure Celery in our project. Here is what our core/celery.py looks like:


import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

app = Celery("core")

# Using Redis as the message broker
app.conf.broker_url = "redis://localhost:6379/0"
app.conf.result_backend = "redis://localhost:6379/0"

app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
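
For Django to load this Celery app on startup, the standard pattern from the Celery docs is to import it in the project package's __init__.py. This is a config fragment — adjust the package name (core) to match your project:

```python
# core/__init__.py
from .celery import app as celery_app

__all__ = ("celery_app",)
```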

The Chess Daily Puzzle Task

The task task_chess_daily_puzzle is responsible for fetching data from the Chess.com API, processing the puzzle information, and sending a beautifully formatted HTML email with the puzzle image and its solution.
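
Embedding the image inline relies on MIME Content-ID references: the HTML body points at cid:<id> and the image part carries a matching Content-ID header. Before diving into the task itself, here is a minimal stdlib-only sketch of that mechanism (no Django; the addresses and image bytes below are placeholders):

```python
from email.message import EmailMessage
from email.utils import make_msgid

msg = EmailMessage()
msg["Subject"] = "Chess.com daily puzzle"
msg["From"] = "from@example.com"    # placeholder address
msg["To"] = "to@example.com"        # placeholder address
msg.set_content("Plain-text fallback body.")

# Generate a Content-ID and reference it from the HTML alternative.
image_cid = make_msgid()
msg.add_alternative(
    f'<p>Puzzle image:</p><img src="cid:{image_cid[1:-1]}" alt="Daily Puzzle">',
    subtype="html",
)

# Attach the image as a *related* part of the HTML alternative.
fake_png = b"\x89PNG\r\n\x1a\n"  # placeholder bytes; the task uses the downloaded image
msg.get_payload()[1].add_related(
    fake_png, maintype="image", subtype="png", cid=image_cid
)
```

The Django EmailMultiAlternatives version in the task below does the same thing with MIMEImage; the Content-ID in the header and the cid: reference in the HTML must match for the image to render inline.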

We use a constant DAILY_PUZZLE_URL which points to the Chess.com public API for puzzles:

from datetime import datetime
from email.mime.image import MIMEImage

import requests
from django.core.mail import EmailMultiAlternatives

DAILY_PUZZLE_URL = "https://api.chess.com/pub/puzzle"

@app.task
def task_chess_daily_puzzle():
    # Fetch the daily puzzle from the public Chess.com API
    page = requests.get(DAILY_PUZZLE_URL, timeout=10)
    puzzle_data = page.json()
    puzzle_time = datetime.fromtimestamp(puzzle_data["publish_time"]).strftime(
        "%Y-%m-%d,  %H:%M:%S"
    )
    image_link = puzzle_data["image"]
    image_data = requests.get(image_link).content
    puzzle_title = puzzle_data["title"]
    image_name = f"{puzzle_time.split(',')[0]}.png"

    solution = puzzle_data["pgn"].split("\n")[-1].strip()

    subject = "Chess.com daily puzzle"
    message = f"Your puzzle is: '{puzzle_title}'. \n\n"
    message += f"Date and time of puzzle is: '{puzzle_time}'. \n\n"
    message += f"Puzzle image is available at {image_link}. \n\n\n"
    message += f"Solution for this puzzle: '{solution}'. \n\n"

    html_message = f"""
    <html>
        <body>
            <p>Your puzzle is: '<b>{puzzle_title}</b>'</p>
            <p>Date and time of puzzle is: '{puzzle_time}'</p>
            <p>Puzzle image:</p>
            <img src="cid:{image_name}" alt="Daily Puzzle">
            <p>Puzzle image is also available at <a href="{image_link}">Daily Puzzle</a> link.</p>
            <p>Solution for this puzzle: '{solution}'</p>
        </body>
    </html>
    """

    email_message = EmailMultiAlternatives(
        subject,
        message,
        "from@email-sender.net",
        ["user1@email-receiver.net"],
    )
    email_message.attach_alternative(html_message, "text/html")

    msg_image = MIMEImage(image_data)
    msg_image.add_header("Content-ID", f"<{image_name}>")
    msg_image.add_header("Content-Disposition", "inline", filename=image_name)
    email_message.attach(msg_image)

    email_message.send(fail_silently=False)
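
The two parsing steps above — formatting publish_time and pulling the move list off the end of the PGN — can be exercised in isolation with a made-up payload (the field values below are illustrative, not real API output):

```python
from datetime import datetime

# Illustrative payload mimicking the fields the task reads (not real API data).
puzzle_data = {
    "publish_time": 1712736000,  # a Unix timestamp
    "pgn": '[Event "Daily Puzzle"]\n[Result "1-0"]\n\n1. Qxf7+ Kh8 2. Qg8#',
}

puzzle_time = datetime.fromtimestamp(puzzle_data["publish_time"]).strftime(
    "%Y-%m-%d,  %H:%M:%S"
)
image_name = f"{puzzle_time.split(',')[0]}.png"  # date part becomes the filename

# The solution is whatever sits on the last line of the PGN.
solution = puzzle_data["pgn"].split("\n")[-1].strip()
print(solution)  # 1. Qxf7+ Kh8 2. Qg8#
```

Note that datetime.fromtimestamp uses the server's local timezone, so the date in image_name depends on where the worker runs.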

Scheduling with Celery Beat

To make this task run every day, we use celery.schedules.crontab. We can define this in our settings or directly in our Celery app configuration:


from celery.schedules import crontab

app.conf.beat_schedule = {
    "task_chess_daily_puzzle": {
        "task": "frontend.tasks.task_chess_daily_puzzle",
        "schedule": crontab(hour=5, minute=10),  # Runs every day at 5:10 AM (UTC by default)
    },
}
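
Because the app uses app.config_from_object("django.conf:settings", namespace="CELERY"), schedule-related settings can also live in Django's settings.py. Celery defaults to UTC, so a fragment like this is only needed if you want the schedule interpreted in another timezone (the value below is just an example):

```python
# settings.py — picked up via the CELERY_ namespace
CELERY_TIMEZONE = "Europe/Madrid"  # example value; Celery's default is "UTC"
```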

Running the system

To see it in action, you need to have Redis running and then start both the worker and the beat scheduler:


# Start Celery Worker
celery -A core worker -l info

# Start Celery Beat
celery -A core beat -l info

Conclusion

By combining Celery and Redis, we can offload time-consuming tasks like API calls and email sending to the background, keeping our web application responsive. Celery Beat adds the layer of automation needed for features like "Daily Puzzles".


Scientific Dev
Educator & Developer