Celery · Python

Fix SoftTimeLimitExceeded: celery.exceptions.SoftTimeLimitExceeded: SoftTimeLimitExceeded() in Celery

This error occurs when a Celery task exceeds its soft time limit. Celery raises SoftTimeLimitExceeded as an exception inside the task so you can perform cleanup. Fix it by catching the exception to save partial progress, optimizing the task to run faster, breaking it into smaller subtasks, or increasing the time limit if the work genuinely takes longer.

Reading the Stack Trace

Traceback (most recent call last):
  File "/app/venv/lib/python3.12/site-packages/celery/app/trace.py", line 477, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/app/venv/lib/python3.12/site-packages/celery/app/trace.py", line 460, in __protected_call__
    return self.run(*args, **kwargs)
  File "/app/app/tasks.py", line 24, in generate_report
    rows = db.session.execute(text(heavy_query)).fetchall()
  File "/app/venv/lib/python3.12/site-packages/billiard/einfo.py", line 54, in __str__
    return str(self.exception)
celery.exceptions.SoftTimeLimitExceeded: SoftTimeLimitExceeded()
[2026-04-10 14:32:01,234: ERROR/ForkPoolWorker-1] Task app.tasks.generate_report[abc123] raised: SoftTimeLimitExceeded()

Here's what each line means:

  1. The celery/app/trace.py frames (trace_task, __protected_call__) are Celery's worker machinery invoking your task function.
  2. File "/app/app/tasks.py", line 24, in generate_report is the line of your own code that was executing when the soft limit fired: the long-running query.
  3. The billiard/einfo.py frame is billiard formatting the exception info for the log.
  4. celery.exceptions.SoftTimeLimitExceeded: SoftTimeLimitExceeded() is the exception Celery raised inside the task when soft_time_limit elapsed.
  5. The final [2026-04-10 14:32:01,234: ERROR/ForkPoolWorker-1] line is the worker's log entry recording the failed task and its id (abc123).

Common Causes

1. Task processes too much data at once

The task tries to process an entire dataset in one go instead of batching, causing it to exceed the time limit.

@celery.task(soft_time_limit=300)
def generate_report(start_date, end_date):
    rows = db.session.execute(text('SELECT * FROM orders')).fetchall()  # millions of rows
    return process_all(rows)

2. No exception handling for soft limit

The task does not catch SoftTimeLimitExceeded, so partial work is lost and the task fails without cleanup.

@celery.task(soft_time_limit=60)
def send_bulk_emails(user_ids):
    for uid in user_ids:
        send_email(uid)  # no try/except, progress lost if limit hit

3. Soft time limit too low for the workload

The time limit is set to a value that is not realistic for the task's normal execution time.

@celery.task(soft_time_limit=10)  # 10 seconds for a heavy report
def generate_report():
    ...

The Fix

Break the work into batches and catch SoftTimeLimitExceeded to save progress and reschedule remaining work. This ensures partial results are not lost and the task can resume from where it left off. Increase the soft_time_limit to a more realistic value for each batch.

Before (broken)
@celery.task(soft_time_limit=300)
def generate_report(start_date, end_date):
    rows = db.session.execute(text('SELECT * FROM orders')).fetchall()
    return process_all(rows)
After (fixed)
from celery.exceptions import SoftTimeLimitExceeded

@celery.task(bind=True, soft_time_limit=600, time_limit=660)
def generate_report(self, start_date, end_date, offset=0, batch_size=1000):
    try:
        rows = db.session.execute(
            text('SELECT * FROM orders LIMIT :limit OFFSET :offset'),
            {'limit': batch_size, 'offset': offset}
        ).fetchall()
        result = process_batch(rows)
        if len(rows) == batch_size:
            # Schedule next batch
            generate_report.delay(start_date, end_date, offset + batch_size, batch_size)
        return result
    except SoftTimeLimitExceeded:
        # Save progress and reschedule remaining work
        save_partial_result(offset)
        generate_report.delay(start_date, end_date, offset, batch_size)
        return {'status': 'partial', 'offset': offset}

Testing the Fix

import pytest
from unittest.mock import patch, MagicMock
from celery.exceptions import SoftTimeLimitExceeded
from app.tasks import generate_report

@pytest.fixture
def celery_app():
    from app import create_app
    app = create_app()
    app.config['CELERY_ALWAYS_EAGER'] = True
    return app

def test_report_processes_batch(celery_app):
    with celery_app.app_context():
        result = generate_report('2026-01-01', '2026-03-31', offset=0, batch_size=10)
        assert result is not None

@patch('app.tasks.save_partial_result')
@patch('app.tasks.db.session.execute')
def test_soft_time_limit_saves_progress(mock_execute, mock_save, celery_app):
    mock_execute.side_effect = SoftTimeLimitExceeded()
    with celery_app.app_context():
        with patch('app.tasks.generate_report.delay') as mock_delay:
            result = generate_report('2026-01-01', '2026-03-31')
            assert result['status'] == 'partial'
            mock_save.assert_called_once_with(0)
            mock_delay.assert_called_once()

@patch('app.tasks.process_batch')
@patch('app.tasks.db.session.execute')
def test_batch_schedules_next_chunk(mock_execute, mock_process, celery_app):
    # Return exactly one full batch so the task must schedule a follow-up
    mock_execute.return_value.fetchall.return_value = [MagicMock()]
    with celery_app.app_context():
        with patch('app.tasks.generate_report.delay') as mock_delay:
            generate_report('2026-01-01', '2026-03-31', offset=0, batch_size=1)
            mock_delay.assert_called_once_with('2026-01-01', '2026-03-31', 1, 1)

Run your tests:

pytest tests/ -v

Pushing Through CI/CD

git checkout -b fix/celery-task-timeout-batching
git add app/tasks.py tests/test_tasks.py
git commit -m "fix: batch report generation and handle SoftTimeLimitExceeded"
git push origin fix/celery-task-timeout-batching

Your CI config should look something like this:

name: CI
on:
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      redis:
        image: redis:7-alpine
        ports:
          - 6379:6379
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
          cache: 'pip'
      - run: pip install -r requirements.txt
      - run: pytest tests/ -v --tb=short
        env:
          CELERY_BROKER_URL: redis://localhost:6379/0

The Full Manual Process: 18 Steps

Here's every step you just went through to fix this one bug:

  1. Notice the error alert or see it in your monitoring tool
  2. Open the error dashboard and read the stack trace
  3. Identify the file and line number from the stack trace
  4. Open your IDE and navigate to the file
  5. Read the surrounding code to understand context
  6. Reproduce the error locally
  7. Identify the root cause
  8. Write the fix
  9. Run the test suite locally
  10. Fix any failing tests
  11. Write new tests covering the edge case
  12. Run the full test suite again
  13. Create a new git branch
  14. Commit and push your changes
  15. Open a pull request
  16. Wait for code review
  17. Merge and deploy to production
  18. Monitor production to confirm the error is resolved

Total time: 30-60 minutes. For one bug.

Or Let bugstack Fix It in Under 2 Minutes

Every step above? bugstack does it automatically.

Step 1: Install the SDK

pip install bugstack

Step 2: Initialize

import os

import bugstack

bugstack.init(api_key=os.environ["BUGSTACK_API_KEY"])

Step 3: There is no step 3.

bugstack handles everything from here:

  1. Captures the stack trace and request context
  2. Pulls the relevant source files from your GitHub repo
  3. Analyzes the error and understands the code context
  4. Generates a minimal, verified fix
  5. Runs your existing test suite
  6. Pushes through your CI/CD pipeline
  7. Deploys to production (or opens a PR for review)

Time from error to fix deployed: Under 2 minutes.

Human involvement: zero.

Try bugstack Free →

No credit card. 5-minute setup. Cancel anytime.

Deploying the Fix (Manual Path)

  1. Run pytest locally to confirm batching and timeout handling work correctly.
  2. Open a pull request with the task refactor and tests.
  3. Wait for CI checks to pass on the PR.
  4. Have a teammate review and approve the PR.
  5. Merge to main and monitor task durations and completion rates in staging.

Frequently Asked Questions

How does BugStack verify the fix is safe?
BugStack tests the task with normal data, simulates SoftTimeLimitExceeded, verifies partial progress is saved, and runs your full suite before marking it safe.

Does BugStack deploy fixes without human review?
BugStack never pushes directly to production. Every fix goes through a pull request with full CI checks, so your team can review it before merging.

What's the difference between the soft and hard time limit?
SoftTimeLimitExceeded is a catchable exception that lets you clean up. The hard time limit kills the worker process with SIGKILL after the grace period, so always set it slightly higher than the soft limit.

Should I reschedule batches with .delay() or use Celery canvas?
For simple sequential batching, scheduling the next batch with .delay() is fine. For complex workflows with dependencies, use Celery chains, chords, or groups.