Technical stack at www.ideo-lab.com
A compact, printable overview, focused on asynchronous processing (Celery + Redis) and observability tooling.
Django + Celery configuration (excerpts)
# settings.py
CELERY_BROKER_URL = env("CELERY_BROKER_URL", default="redis://127.0.0.1:6379/1")
CELERY_RESULT_BACKEND = env("CELERY_RESULT_BACKEND", default="redis://127.0.0.1:6379/2")  # or ""/None if unused
CELERY_TASK_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_RESULT_SERIALIZER = "json"
CELERY_TIMEZONE = "Europe/Paris"
CELERY_TASK_ACKS_LATE = True
CELERY_TASK_REJECT_ON_WORKER_LOST = True
CELERY_TASK_ALWAYS_EAGER = False  # set to True in dev if needed
CELERY_TASK_DEFAULT_QUEUE = "default"
CELERY_TASK_ROUTES = {
    "app.tasks.envoi_email": {"queue": "high"},
    "app.tasks.rapport_pdf": {"queue": "default"},
}
CELERY_BEAT_SCHEDULE = {
    "cleanup-results-hourly": {
        "task": "app.tasks.cleanup_results",
        "schedule": 3600,  # seconds
    },
}
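The routing table above sends each task name to a named queue, falling back to CELERY_TASK_DEFAULT_QUEUE when no route matches. A minimal pure-Python sketch of that resolution (illustrative only; `route_for_task` is a hypothetical helper, and Celery's real router additionally supports glob patterns and router functions):

```python
# Illustrative sketch of CELERY_TASK_ROUTES resolution (not Celery's actual router).
TASK_ROUTES = {
    "app.tasks.envoi_email": {"queue": "high"},
    "app.tasks.rapport_pdf": {"queue": "default"},
}
DEFAULT_QUEUE = "default"


def route_for_task(task_name: str) -> str:
    """Return the queue a task name maps to, falling back to the default queue."""
    route = TASK_ROUTES.get(task_name, {})
    return route.get("queue", DEFAULT_QUEUE)


print(route_for_task("app.tasks.envoi_email"))      # → high
print(route_for_task("app.tasks.cleanup_results"))  # → default
```

Tasks with no explicit route (such as cleanup_results above) simply land on the default queue.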
# projet/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "projet.settings")

app = Celery("projet")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print(f"Request: {self.request!r}")
# app/tasks.py
from celery import shared_task


@shared_task(
    bind=True,
    autoretry_for=(Exception,),  # retry_backoff/retry_jitter only apply to auto-retries
    max_retries=5,
    retry_backoff=2,
    retry_jitter=True,
    acks_late=True,
)
def envoi_email(self, destinataire_id: int):
    # ... send via AWS SES ...
    return {"status": "ok"}


@shared_task
def cleanup_results():
    # Purge stale results/artifacts past their TTL
    pass
Launch & monitoring commands
# Worker (3 queues)
celery -A projet worker -l info -Q high,default,low -O fair -n worker1@%h --concurrency=4

# Scheduler
celery -A projet beat -l info

# Monitoring
celery -A projet inspect ping
celery -A projet flower --port=5555
docker-compose (excerpt)
services:
  redis:
    image: redis:7
    ports: ["6379:6379"]
  worker:
    build: .
    command: celery -A projet worker -l info -Q high,default,low -O fair
    depends_on: [redis, web]
  beat:
    build: .
    command: celery -A projet beat -l info
    depends_on: [redis, web]
  flower:
    image: mher/flower
    command: ["flower", "-A", "projet", "--port=5555"]
    ports: ["5555:5555"]
    depends_on: [redis, worker]
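To sanity-check that the redis service is reachable before the workers start consuming, you can speak raw RESP over a socket, with no client library needed. A minimal sketch (assumes Redis on 127.0.0.1:6379; `encode_command` and `redis_ping` are hypothetical helper names):

```python
import socket


def encode_command(*parts: str) -> bytes:
    """Encode a Redis command in RESP: an array of bulk strings."""
    out = [f"*{len(parts)}\r\n".encode()]
    for part in parts:
        data = part.encode()
        out.append(b"$%d\r\n%s\r\n" % (len(data), data))
    return b"".join(out)


def redis_ping(host: str = "127.0.0.1", port: int = 6379) -> bool:
    """Return True if the server answers +PONG to a PING."""
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall(encode_command("PING"))
        return sock.recv(64).startswith(b"+PONG")
```

In Compose, the same idea is usually expressed declaratively with a `healthcheck` running `redis-cli ping` on the redis service.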
Tip: on EC2 without Compose, create separate systemd units (web/gunicorn, worker, beat, flower) with WantedBy=multi-user.target and Restart=on-failure.
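A worker unit might look like the sketch below; the paths, user, and virtualenv location are assumptions to adapt to the actual deployment:

```ini
# /etc/systemd/system/celery-worker.service (hypothetical paths and user)
[Unit]
Description=Celery worker for projet
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/srv/projet
ExecStart=/srv/projet/.venv/bin/celery -A projet worker -l info -Q high,default,low -O fair
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The beat, flower, and gunicorn units follow the same pattern with their respective ExecStart commands; run `systemctl daemon-reload` then `systemctl enable --now <unit>` after creating each file.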