
Scaling Django Applications with Celery

Implementing asynchronous task queues, load balancing, and background processing for high-traffic Django applications using Celery and Redis.

December 5, 2025 · 10 min read

When You Need Celery

Your Django app is working fine until:

  • Users complain about slow responses
  • Email sending blocks requests
  • Report generation times out
  • Background jobs pile up

That's when Celery becomes essential: move the slow work onto a background worker and return the response immediately, as the sketch below suggests.
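
As a minimal sketch of the before/after (create_user is a hypothetical helper; send_welcome_email is the task defined later in this post):

    # views.py
    from django.http import JsonResponse

    def register(request):
        user = create_user(request.POST)  # hypothetical helper

        # Before: blocks the request until the SMTP round-trip finishes
        # send_welcome_email(user.id)

        # After: enqueue and return immediately; a Celery worker does the rest
        send_welcome_email.delay(user.id)
        return JsonResponse({'status': 'ok'})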

Setup

Installation

    pip install celery redis

Configuration

    # celery.py
    import os

    from celery import Celery

    # Point Celery at Django's settings before creating the app
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

    app = Celery('myproject')
    app.config_from_object('django.conf:settings', namespace='CELERY')
    app.autodiscover_tasks()
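
For the app to load whenever Django starts, the Celery docs also have the project package import it (assuming the package is named myproject):

    # myproject/__init__.py
    from .celery import app as celery_app

    __all__ = ('celery_app',)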

Settings

    # settings.py
    CELERY_BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
    CELERY_ACCEPT_CONTENT = ['json']
    CELERY_TASK_SERIALIZER = 'json'

Task Patterns

Basic Task

    from django.contrib.auth.models import User
    from django.core.mail import send_mail  # Django's built-in mailer

    @app.task
    def send_welcome_email(user_id):
        user = User.objects.get(id=user_id)
        send_mail("Welcome!", "...", "noreply@example.com", [user.email])
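
Tasks are enqueued rather than called directly. Both forms below are standard Celery API; user stands in for whatever object your view or signal handler has on hand:

    # Fire-and-forget from a view or signal handler
    send_welcome_email.delay(user.id)

    # apply_async exposes more options, e.g. run no sooner than 10 minutes from now
    send_welcome_email.apply_async(args=[user.id], countdown=600)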

Task with Retry

    @app.task(bind=True, max_retries=3)
    def process_payment(self, order_id):
        try:
            order = Order.objects.get(id=order_id)
            payment_gateway.charge(order)
        except PaymentError as exc:
            raise self.retry(exc=exc, countdown=60)
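
Once the three retries are exhausted, Celery raises MaxRetriesExceededError and marks the task failed. With the result backend configured above, callers can observe that state, for example:

    result = process_payment.delay(order.id)

    # Later, e.g. in a status endpoint
    if result.failed():
        ...  # surface the failure to the user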

Periodic Tasks

    # Using Celery Beat
    from celery.schedules import crontab

    app.conf.beat_schedule = {
        'cleanup-every-hour': {
            'task': 'myapp.tasks.cleanup_old_data',
            'schedule': crontab(minute=0),  # top of every hour
        },
    }
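
The schedule only fires while a beat process runs alongside the workers:

    celery -A myproject beat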

Scaling Strategies

Multiple Workers

    # Start a worker with 4 concurrent child processes
    celery -A myproject worker -c 4

Run the same command on additional machines pointed at the same Redis broker to scale horizontally.

Dedicated Queues

    @app.task(queue='high_priority')
    def urgent_task():
        pass

    @app.task(queue='low_priority')
    def batch_task():
        pass

    # Run workers for specific queues
    celery -A myproject worker -Q high_priority -c 2
    celery -A myproject worker -Q low_priority -c 1
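
Routing can also live in settings rather than on each decorator; with the CELERY namespace configured earlier, that looks like:

    # settings.py
    CELERY_TASK_ROUTES = {
        'myapp.tasks.urgent_task': {'queue': 'high_priority'},
        'myapp.tasks.batch_task': {'queue': 'low_priority'},
    }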

Rate Limiting

    @app.task(rate_limit='100/m')
    def api_call_task():
        # Limited to 100 calls per minute, enforced per worker instance
        pass

Monitoring

Flower

    pip install flower

    celery -A myproject flower

The dashboard is served on http://localhost:5555 by default.

Flower provides:

  • Real-time task monitoring
  • Worker status
  • Task history
  • Rate limiting controls

Custom Monitoring

    from celery.signals import task_success, task_failure

    # `metrics` stands in for your stats client (StatsD, Datadog, etc.)
    @task_success.connect
    def task_success_handler(sender, result, **kwargs):
        metrics.increment('celery.task.success', tags=[f'task:{sender.name}'])

    @task_failure.connect
    def task_failure_handler(sender, exception, **kwargs):
        metrics.increment('celery.task.failure', tags=[f'task:{sender.name}'])

Production Considerations

Memory Management

    # Recycle each worker child process after 1000 tasks to contain slow leaks
    celery -A myproject worker --max-tasks-per-child=1000
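
A companion flag caps by resident memory instead of task count; the limit is in kilobytes, and the process is replaced after its current task finishes:

    celery -A myproject worker --max-memory-per-child=200000  # ~200 MB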

Error Handling

    @app.task(bind=True, autoretry_for=(TransientError,), retry_backoff=True)
    def robust_task(self):
        # Retries automatically on TransientError with exponential backoff
        pass

Task Results

    # Don't store results if you don't need them
    @app.task(ignore_result=True)
    def fire_and_forget():
        pass
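
When you do need a result, the Redis backend configured earlier lets you fetch it; some_task below stands for any task that returns a value (avoid blocking get() calls inside other tasks):

    result = some_task.delay(arg)
    value = result.get(timeout=10)  # blocks up to 10 seconds for the return value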

Real-World Examples

At Andalus Judge

  • Submission evaluation in isolated workers
  • Results stored in Redis, then database
  • Separate queues for different problem difficulties

At Juniper

  • Email notifications queued
  • Report generation in background
  • Data sync with external systems

Common Pitfalls

  • **Database connections** - Use connection pooling
  • **Large payloads** - Pass IDs, not objects (see the sketch after this list)
  • **Task dependencies** - Use chains and groups carefully
  • **Monitoring** - Always have visibility into task status
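
A minimal sketch of the payload pitfall, using the process_payment task from earlier:

    # Bad: serializes the whole ORM object into the broker message;
    # the data may be stale by the time a worker picks it up
    process_payment.delay(order)

    # Good: pass the primary key and let the worker re-fetch fresh data
    process_payment.delay(order.id)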

Conclusion

Celery transforms Django from a synchronous framework into a scalable platform. Start simple, monitor closely, and scale as needed.