Fix Sidekiq::JobTimeout: Job exceeded timeout of 25 seconds in Sidekiq
This error occurs when a Sidekiq job takes longer than its configured timeout to complete. Here the limit is 25 seconds per job, enforced by Sidekiq Enterprise's job timeout feature. To resolve it, optimize the job to run faster, increase the timeout for genuinely long-running jobs, or break the work into smaller jobs that each finish within the window.
Reading the Stack Trace
Here's what each line means:
- sidekiq (7.2.1) lib/sidekiq/processor.rb:195:in `process': The Sidekiq processor kills the job after it exceeds the configured timeout.
- app/jobs/data_import_job.rb:28:in `perform': The data import job at line 28 is performing a long-running operation that exceeds 25 seconds.
- sidekiq (7.2.1) lib/sidekiq/processor.rb:130:in `process': The main processing loop where the timeout is enforced per job.
Common Causes
1. Processing too many records in one job
A single job tries to process thousands of records instead of batching.
```ruby
class DataImportJob
  include Sidekiq::Job

  def perform(file_path)
    CSV.foreach(file_path, headers: true) do |row|
      User.create!(row.to_h) # Processes 100k rows in one job
    end
  end
end
```
2. Slow external API call
The job makes a synchronous HTTP request to a slow external service.
```ruby
class WebhookJob
  include Sidekiq::Job

  def perform(url, payload)
    response = Net::HTTP.post(URI(url), payload.to_json) # No timeout set
  end
end
```
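One way to keep a slow endpoint from holding the worker thread is to set explicit connection and read timeouts on the HTTP client. A minimal sketch using Ruby's standard `Net::HTTP` (the `build_http` helper and the 5/10-second values are illustrative, not from the original code):

```ruby
require "net/http"
require "json"

# Illustrative helper: builds an HTTP client with explicit timeouts so a
# slow endpoint fails fast instead of pinning a Sidekiq thread until the
# job timeout kills it.
def build_http(uri)
  http = Net::HTTP.new(uri.host, uri.port)
  http.use_ssl = uri.scheme == "https"
  http.open_timeout = 5   # seconds allowed to open the TCP connection
  http.read_timeout = 10  # seconds allowed to wait for the response
  http
end

# Inside WebhookJob#perform you might then call:
#   uri = URI(url)
#   build_http(uri).post(uri.path, payload.to_json,
#                        "Content-Type" => "application/json")
```

A timed-out request raises `Net::OpenTimeout` or `Net::ReadTimeout`, which Sidekiq's retry machinery can then handle normally.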
3. Unoptimized database queries
The job runs complex queries without indexes or batching.
```ruby
class ReportJob
  include Sidekiq::Job

  def perform
    orders = Order.where('created_at > ?', 1.year.ago)
    orders.each do |order| # N+1: loads line_items per order
      order.line_items.sum(:total)
    end
  end
end
```
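For the N+1 report above, one option is to push the aggregation into a single grouped query. A sketch, assuming a `LineItem` model backs `order.line_items` and has a `total` column (names inferred from the example, not confirmed by the original):

```ruby
class ReportJob
  include Sidekiq::Job

  def perform
    # One grouped aggregate instead of one query per order.
    totals = LineItem.joins(:order)
                     .where(orders: { created_at: 1.year.ago.. })
                     .group(:order_id)
                     .sum(:total)
    # totals is a hash of { order_id => sum_of_line_item_totals }
  end
end
```

This replaces thousands of per-order queries with one `GROUP BY`, which usually brings this kind of job well under the timeout.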
The Fix
Break the import into batches of 500 records using insert_all for bulk insertion. Each batch job enqueues the next batch, keeping each job well within the timeout. This pattern ensures no single job runs too long.
Before:

```ruby
class DataImportJob
  include Sidekiq::Job

  def perform(file_path)
    CSV.foreach(file_path, headers: true) do |row|
      User.create!(row.to_h)
    end
  end
end
```
After:

```ruby
class DataImportJob
  include Sidekiq::Job
  sidekiq_options queue: :imports

  def perform(file_path, batch_start = 0, batch_size = 500)
    # CSV::Table doesn't support [start, length], so convert to an
    # array of hashes first. The file is re-read per batch, which is
    # fine for moderately sized imports.
    rows = CSV.read(file_path, headers: true).map(&:to_h)
    batch = rows[batch_start, batch_size]
    return if batch.nil? || batch.empty?

    User.insert_all(batch)

    next_start = batch_start + batch_size
    DataImportJob.perform_async(file_path, next_start, batch_size) if next_start < rows.size
  end
end
```
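Chaining works, but if you would rather fan out every batch up front, Sidekiq 6.3+ offers `perform_bulk`, which enqueues many jobs in one round trip to Redis. A sketch; the `batch_args` helper and row counts are illustrative:

```ruby
# Illustrative helper: builds one argument list per batch. Each element
# becomes a separate DataImportJob.perform(file_path, start, size) call.
def batch_args(file_path, row_count, batch_size = 500)
  (0...row_count).step(batch_size).map { |start| [file_path, start, batch_size] }
end

# With Sidekiq 6.3+:
#   DataImportJob.perform_bulk(batch_args("tmp/users.csv", 100_000))
```

The trade-off: fan-out enqueues everything immediately and batches run concurrently, while chaining processes them strictly in order, one job at a time.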
Testing the Fix
```ruby
require 'rails_helper'
require 'sidekiq/testing'

RSpec.describe DataImportJob do
  before { Sidekiq::Testing.fake! }

  it 'imports a batch of records' do
    file_path = Rails.root.join('spec/fixtures/files/users.csv')
    DataImportJob.perform_async(file_path.to_s, 0, 500)
    expect(DataImportJob.jobs.size).to eq(1)
  end

  it 'enqueues next batch when more rows exist' do
    # Run inline! with a block so the testing mode is restored afterwards
    Sidekiq::Testing.inline! do
      file_path = Rails.root.join('spec/fixtures/files/large_users.csv')
      DataImportJob.perform_async(file_path.to_s, 0, 10)
    end
    expect(User.count).to be > 0
  end
end
```
Run your tests:
```sh
bundle exec rspec spec/jobs/data_import_job_spec.rb
```
Pushing Through CI/CD
```sh
git checkout -b fix/sidekiq-job-timeout
git add app/jobs/data_import_job.rb
git commit -m "fix: batch data import job to prevent timeout"
git push origin fix/sidekiq-job-timeout
```
Your CI config should look something like this:
```yaml
name: CI
on:
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: postgres
        ports: ['5432:5432']
      redis:
        image: redis:7
        ports: ['6379:6379']
    steps:
      - uses: actions/checkout@v4
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.3'
          bundler-cache: true
      - run: bin/rails db:setup
      - run: bundle exec rspec
```
The Full Manual Process: 18 Steps
Here's every step you just went through to fix this one bug:
- Notice the error alert or see it in your monitoring tool
- Open the error dashboard and read the stack trace
- Identify the file and line number from the stack trace
- Open your IDE and navigate to the file
- Read the surrounding code to understand context
- Reproduce the error locally
- Identify the root cause
- Write the fix
- Run the test suite locally
- Fix any failing tests
- Write new tests covering the edge case
- Run the full test suite again
- Create a new git branch
- Commit and push your changes
- Open a pull request
- Wait for code review
- Merge and deploy to production
- Monitor production to confirm the error is resolved
Total time: 30-60 minutes. For one bug.
Or Let bugstack Fix It in Under 2 Minutes
Every step above? bugstack does it automatically.
Step 1: Install the SDK
```sh
gem install bugstack
```
Step 2: Initialize
```ruby
require 'bugstack'
Bugstack.init(api_key: ENV['BUGSTACK_API_KEY'])
```
Step 3: There is no step 3.
bugstack handles everything from here:
- Captures the stack trace and request context
- Pulls the relevant source files from your GitHub repo
- Analyzes the error and understands the code context
- Generates a minimal, verified fix
- Runs your existing test suite
- Pushes through your CI/CD pipeline
- Opens a pull request with the fix for your team to review and merge
Time from error to fix deployed: Under 2 minutes.
Human involvement: zero.
Try bugstack Free → No credit card. 5-minute setup. Cancel anytime.
Deploying the Fix (Manual Path)
- Refactor the job to process data in batches.
- Add job specs verifying batch processing and chaining.
- Monitor job duration in the Sidekiq dashboard.
- Open a pull request.
- Merge and verify job processing times in staging.
Frequently Asked Questions
How does BugStack verify a fix is safe?
BugStack runs the fix through your existing test suite, generates additional edge-case tests, and validates that no other components are affected before marking it safe to deploy.
Does BugStack push fixes straight to production?
BugStack never pushes directly to production. Every fix goes through a pull request with full CI checks, so your team can review it before merging.
How do I increase the Sidekiq job timeout?
For Sidekiq Enterprise, set the timeout per job class with `sidekiq_options timeout: 60`. For OSS Sidekiq, the timeout is set at the process level via the `-t` flag.
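If raising the timeout is the right call, the per-class option described above looks like this (60 seconds is an arbitrary example value):

```ruby
# Sidekiq Enterprise: per-class job timeout, in seconds.
class DataImportJob
  include Sidekiq::Job
  sidekiq_options timeout: 60
end

# OSS Sidekiq instead takes a process-level shutdown timeout:
#   bundle exec sidekiq -t 30
```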
What happens when a job exceeds the timeout?
The job is killed, and Sidekiq retries it according to its retry policy. If the job times out on every attempt, it exhausts its retries and moves to the dead job queue.
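The retry policy is configurable per job class. A sketch of capping retries and hooking the moment a job lands in the dead set (the retry count and log message are illustrative):

```ruby
class DataImportJob
  include Sidekiq::Job
  sidekiq_options retry: 5 # give up after 5 retries instead of the default 25

  # Called once retries are exhausted, just before the job
  # moves to the dead job queue.
  sidekiq_retries_exhausted do |job, ex|
    Sidekiq.logger.error("import gave up: #{job['args']} (#{ex.message})")
  end
end
```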