Fix NoMemoryError: failed to allocate memory (NoMemoryError) in Rails
This error occurs when your Ruby process exhausts available memory. Common causes include loading too many records into memory at once, large file processing without streaming, and memory leaks from global caches or retained references. Use find_each for batch processing, stream large files, and monitor memory with tools like derailed_benchmarks.
Reading the Stack Trace
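An illustrative trace for this error (reconstructed to match the frames discussed below; paths and line numbers are examples, not from a real app) looks like:

```
activerecord (7.1.3) lib/active_record/relation.rb:288:in `to_a': failed to allocate memory (NoMemoryError)
	from app/services/export_service.rb:12:in `generate_csv'
	from app/controllers/exports_controller.rb:8:in `create'
```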
Here's what each line means:
- app/services/export_service.rb:12:in `generate_csv': The export service loads all records into memory to generate the CSV.
- activerecord (7.1.3) lib/active_record/relation.rb:288:in `to_a': ActiveRecord converts the entire result set to an array, consuming large amounts of memory.
- app/controllers/exports_controller.rb:8:in `create': The controller action triggers the memory-intensive export operation synchronously.
Common Causes
1. Loading all records into memory
Using Model.all.to_a loads every record into a Ruby array, consuming memory proportional to table size.
```ruby
class ExportService
  def generate_csv
    orders = Order.all.to_a # Loads millions of rows into memory
    CSV.generate do |csv|
      orders.each { |o| csv << [o.id, o.total, o.created_at] }
    end
  end
end
```
2. String concatenation in loop
Building a large string through concatenation creates many intermediate string objects.
```ruby
def build_report
  result = ''
  Order.all.each do |order|        # Also loads every record at once
    result += order.to_json + "\n" # Creates a new string on each iteration
  end
  result
end
```
3. Unbounded cache growth
A cache without size limits grows indefinitely and consumes all available memory.
```ruby
CACHE = {}

def fetch_data(key)
  CACHE[key] ||= expensive_computation(key) # Never evicts entries
end
```
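One way to cap growth is a size-bounded cache; this sketch (class name hypothetical) relies on Ruby hashes preserving insertion order, so `shift` evicts the oldest entry:

```ruby
# Size-bounded cache sketch: once the cap is hit, the oldest entry is
# evicted before a new one is stored.
class BoundedCache
  def initialize(max_size: 1000)
    @max_size = max_size
    @store = {}
  end

  def fetch(key)
    return @store[key] if @store.key?(key)

    @store.shift if @store.size >= @max_size # evict oldest insertion
    @store[key] = yield
  end
end
```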
The Fix
Use find_each with batch_size to load records in batches of 1000 instead of all at once. Stream the CSV output to an IO object instead of building a giant string in memory.
Before:

```ruby
class ExportService
  def generate_csv
    orders = Order.all.to_a # Entire table materialized in memory
    CSV.generate do |csv|
      orders.each { |o| csv << [o.id, o.total, o.created_at] }
    end
  end
end
```

After:

```ruby
require 'csv'
require 'stringio'

class ExportService
  def generate_csv(io = StringIO.new)
    csv = CSV.new(io)
    csv << ['ID', 'Total', 'Created At']
    Order.find_each(batch_size: 1000) do |order| # 1000 rows in memory at a time
      csv << [order.id, order.total, order.created_at]
    end
    io.rewind
    io
  end
end
```
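The same pattern can be exercised without Rails; in this sketch `each_slice` stands in for `find_each` and a Struct stands in for the Order model:

```ruby
require 'csv'
require 'stringio'

# Plain-Ruby sketch of the batched streaming pattern: rows are written
# to the IO as each small batch arrives, never all at once.
Order = Struct.new(:id, :total)

def generate_csv(orders, io = StringIO.new, batch_size: 1000)
  csv = CSV.new(io)
  csv << ['ID', 'Total']
  orders.each_slice(batch_size) do |batch| # stands in for find_each
    batch.each { |o| csv << [o.id, o.total] }
  end
  io.rewind
  io
end
```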
Testing the Fix
```ruby
require 'rails_helper'

RSpec.describe ExportService do
  describe '#generate_csv' do
    it 'generates CSV without loading all records' do
      create_list(:order, 5)
      service = ExportService.new
      result = service.generate_csv
      csv = CSV.parse(result.string, headers: true)
      expect(csv.size).to eq(5)
    end

    it 'includes headers' do
      service = ExportService.new
      result = service.generate_csv
      headers = CSV.parse(result.string).first
      expect(headers).to eq(['ID', 'Total', 'Created At'])
    end
  end
end
```
Run your tests:
```bash
bundle exec rspec spec/services/export_service_spec.rb
```
Pushing Through CI/CD
```bash
git checkout -b fix/rails-memory-bloat
git add app/services/export_service.rb
git commit -m "fix: use find_each and streaming CSV to prevent memory bloat"
git push origin fix/rails-memory-bloat
```
Your CI config should look something like this:
```yaml
name: CI
on:
  pull_request:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_PASSWORD: postgres
        ports: ['5432:5432']
    steps:
      - uses: actions/checkout@v4
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.3'
          bundler-cache: true
      - run: bin/rails db:setup
      - run: bundle exec rspec
```
The Full Manual Process: 18 Steps
Here's every step you just went through to fix this one bug:
1. Notice the error alert or see it in your monitoring tool
2. Open the error dashboard and read the stack trace
3. Identify the file and line number from the stack trace
4. Open your IDE and navigate to the file
5. Read the surrounding code to understand context
6. Reproduce the error locally
7. Identify the root cause
8. Write the fix
9. Run the test suite locally
10. Fix any failing tests
11. Write new tests covering the edge case
12. Run the full test suite again
13. Create a new git branch
14. Commit and push your changes
15. Open a pull request
16. Wait for code review
17. Merge and deploy to production
18. Monitor production to confirm the error is resolved
Total time: 30-60 minutes. For one bug.
Or Let bugstack Fix It in Under 2 Minutes
Every step above? bugstack does it automatically.
Step 1: Install the SDK
```bash
gem install bugstack
```
Step 2: Initialize
```ruby
require 'bugstack'

Bugstack.init(api_key: ENV['BUGSTACK_API_KEY'])
```
Step 3: There is no step 3.
bugstack handles everything from here:
- Captures the stack trace and request context
- Pulls the relevant source files from your GitHub repo
- Analyzes the error and understands the code context
- Generates a minimal, verified fix
- Runs your existing test suite
- Pushes through your CI/CD pipeline
- Deploys to production (or opens a PR for review)
Time from error to fix deployed: Under 2 minutes.
Human involvement: zero.
Try bugstack Free → No credit card. 5-minute setup. Cancel anytime.
Deploying the Fix (Manual Path)
- Replace .all.to_a with find_each batch processing.
- Add streaming for large file generation.
- Profile memory usage with derailed_benchmarks.
- Open a pull request.
- Merge and monitor memory usage in production.
Frequently Asked Questions
How does BugStack verify a fix is safe?
BugStack runs the fix through your existing test suite, generates additional edge-case tests, and validates that no other components are affected before marking it safe to deploy.
Does BugStack push fixes straight to production?
BugStack never pushes directly to production. Every fix goes through a pull request with full CI checks, so your team can review it before merging.
How can I profile memory usage in a Rails app?
Use the memory_profiler gem for detailed allocation reports, or derailed_benchmarks to measure memory per request. ObjectSpace can track object counts.
How much memory should a Rails process use?
A typical Rails process uses 200-500MB. Set a memory limit around 512MB per worker and use a tool like puma_worker_killer to restart workers that exceed it.
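As a stdlib-only illustration of what those profiling tools measure, `GC.stat(:total_allocated_objects)` is a monotonic counter that can approximate how many objects a block allocates (the helper name here is made up):

```ruby
# Sketch: sample Ruby's monotonic allocation counter around a block to
# estimate how many objects it allocates. memory_profiler reports this
# in far more detail, broken down per gem and per call site.
def allocations_during
  before = GC.stat(:total_allocated_objects)
  yield
  GC.stat(:total_allocated_objects) - before
end
```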