Backup SQLite from a Containerized Rails App Deployed with Kamal

February 17, 2025

This post walks through backing up a SQLite database inside a Docker container on a remote server, specifically for a Ruby on Rails application deployed with Kamal. We'll begin with the basic commands, then convert them into a reusable shell script and a Rake task for convenience.

Since Kamal 2 was released, I've been using it to host a few applications and websites for my clients (yes - Rails is fantastic and perfectly fine for plain old marketing websites) on multiple Linux servers.

Most of them use SQLite as the database, and the Rails + Hotwire + SQLite combo has been a fantastic alternative to the traditional WordPress + MySQL (or even static) sites one might use for a simple marketing site that needs some lightweight CMS functionality, which I'm always happy to hand-code in Ruby.

Anyway, when deploying Rails applications that use SQLite with Kamal, the SQLite database lives inside a Docker container on my production Linux server (although mapped to a persistent storage volume via Kamal). For the first few months, I just ran a few commands to back up the SQLite file - but recently I 'automated' it, first with a bash script and then with Ruby + Rake, and thought I'd share the solution in case others find it useful, and also as a reference for future me.

First, the manual way...

Here are the steps to back up a SQLite database running in a container:

  1. Connect to the container running on the remote server
  2. Access the database file inside the container
  3. Create a backup while the database is in use (in the container)
  4. Copy the backup from the container to the host
  5. Transfer it to our local machine

If you just want to get the backup and be done with it, here are the commands you'd run for the above steps:

# Connect to the Server
ssh user@server-ip

# Get the container ID
CONTAINER_ID=$(docker ps --filter name=your-app-web --format '{{.ID}}')

# Create backup inside container
docker exec $CONTAINER_ID sqlite3 /rails/storage/production.sqlite3 ".backup /tmp/backup.sqlite3"

# Copy from container to host
docker cp $CONTAINER_ID:/tmp/backup.sqlite3 /root/backup.sqlite3

# Copy from host to local machine (run it on local machine)
scp root@your-server:/root/backup.sqlite3 ./backup.sqlite3

Let's try to understand what each of the above commands is doing:

  • Get the container ID
CONTAINER_ID=$(docker ps --filter name=your-app-web --format '{{.ID}}')

First, we get the ID of the container for our web application by running docker ps, which lists all running Docker containers, and filtering them by the container name. Then, we format the output to display only the container ID, which is stored in the CONTAINER_ID variable.

  • Create backup inside container
docker exec $CONTAINER_ID sqlite3 /rails/storage/production.sqlite3 ".backup /tmp/backup.sqlite3"

This command runs an SQLite backup operation inside our container, using the docker exec command. This backup is stored at /tmp/backup.sqlite3, inside the container.

  • Copy from container to host
docker cp $CONTAINER_ID:/tmp/backup.sqlite3 /root/backup.sqlite3

Next, we copy the backup from the running Docker container to the host server (not your local computer yet! we'll do that in the next step). The docker cp command is used to copy files between the container and the host machine. The backup is stored at /root/backup.sqlite3 on the server.

  • Copy from host to local machine
scp root@your-server:/root/backup.sqlite3 ./backup.sqlite3

Finally, we copy the backup file from the remote server to the local machine using scp (Secure Copy Protocol). The backup is copied to the current working directory on your local computer.
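Once the file is on your machine, it's worth a quick sanity check before trusting it. Here's a minimal verification, assuming sqlite3 is installed locally and the backup landed at ./backup.sqlite3:

```shell
# Check that the backup is a structurally valid SQLite database.
# Prints "ok" when no corruption is detected.
sqlite3 ./backup.sqlite3 "PRAGMA integrity_check;"

# Every SQLite file starts with a 16-byte magic header;
# the first 15 bytes are the string "SQLite format 3".
head -c 15 ./backup.sqlite3
```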

At this point, I hope you were able to generate the backup successfully. However, you don't back up the database once and call it done - you have to take backups frequently. It'd be nice to have a simple script that you could run (or even better, that a cron job could run) at periodic intervals.

So, let's scriptify our commands.

Automating with a Shell Script

Let's create a shell script that handles the entire backup process. Create a bash script named backup-db.sh under the bin directory in your Rails project. Don't forget to make this script executable:

$ touch bin/backup-db.sh
$ chmod +x bin/backup-db.sh

Here's the backup script you can copy + paste (replace the placeholders with your environment values). I've added some enhancements like logging and saving the backup file with a timestamp, so it's easier to keep track of the database files. The script also cleans up backups older than 30 days, plus the temporary backups generated on the server, to keep everything neat and tidy.

#!/bin/bash

# Configuration
SERVER_IP="your-server-ip-address"
SERVER_USER="root"
CONTAINER_NAME="your-app-web"
LOCAL_BACKUP_DIR="/path/to/backups"
RETENTION_DAYS=30
LOG_FILE="${LOCAL_BACKUP_DIR}/backup.log"

# Function to log messages
log_message() {
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

# Create backup directory
mkdir -p "$LOCAL_BACKUP_DIR"

# Get container ID
CONTAINER_ID=$(ssh ${SERVER_USER}@${SERVER_IP} "docker ps --filter name=${CONTAINER_NAME} --format '{{.ID}}'" 2>/dev/null)

if [ -z "$CONTAINER_ID" ]; then
  log_message "ERROR: Could not find container ID"
  exit 1
fi

# Create backup filename with timestamp
BACKUP_FILENAME="production_$(date +%Y%m%d_%H%M%S).sqlite3"
REMOTE_TEMP_PATH="/rails/storage/backup_${BACKUP_FILENAME}"

# Create backup inside container
ssh ${SERVER_USER}@${SERVER_IP} "docker exec ${CONTAINER_ID} sqlite3 /rails/storage/production.sqlite3 \".backup ${REMOTE_TEMP_PATH}\"" || exit 1

# Copy backup from container to host
ssh ${SERVER_USER}@${SERVER_IP} "docker cp ${CONTAINER_ID}:${REMOTE_TEMP_PATH} /root/${BACKUP_FILENAME}" || exit 1

# Copy backup to local machine
scp ${SERVER_USER}@${SERVER_IP}:/root/${BACKUP_FILENAME} "${LOCAL_BACKUP_DIR}/${BACKUP_FILENAME}" || exit 1

# Cleanup remote files
ssh ${SERVER_USER}@${SERVER_IP} "rm -f /root/${BACKUP_FILENAME} && docker exec ${CONTAINER_ID} rm -f ${REMOTE_TEMP_PATH}"

# Clean up old backups
find "$LOCAL_BACKUP_DIR" -name "production_*.sqlite3" -type f -mtime +${RETENTION_DAYS} -delete

log_message "Backup completed: ${BACKUP_FILENAME}"


When you need to take the backup, you can run the backup script as follows:

$ bin/backup-db.sh

After I wrote the script (with some help from Cursor, which is just fantastic with shell commands), I decided to go one step further and convert the shell script into a Ruby program, to make it more readable and maintainable. Having it in Ruby also lets us run the script via a Rake task.

In hindsight, I should have started with the Ruby program, but writing the shell script was a nice exercise to work out my shell muscles, with help from AI.

Converting Shell to Ruby Script

Let's create a Ruby class under the /lib folder. Why under /lib? Because taking a backup is not part of your core application - and the lib directory is perfect for such utility programs.

# lib/database_backup.rb

require "open3"
require "fileutils"

class DatabaseBackup
  SERVER_IP = "your-server-ip"
  SERVER_USER = "root"
  CONTAINER_NAME = "your-app-web"
  LOCAL_BACKUP_DIR = "/path/to/backups"
  RETENTION_DAYS = 30
  LOG_FILE = File.join(LOCAL_BACKUP_DIR, "backup.log")

  def self.run
    new.perform_backup
  end

  def perform_backup
    FileUtils.mkdir_p(LOCAL_BACKUP_DIR)

    container_id = fetch_container_id
    return unless container_id

    backup_filename = "production_#{Time.current.strftime('%Y%m%d_%H%M%S')}.sqlite3"
    remote_temp_path = "/rails/storage/backup_#{backup_filename}"

    if create_backup(container_id, remote_temp_path) &&
       copy_to_host(container_id, remote_temp_path, backup_filename) &&
       copy_to_local(container_id, remote_temp_path, backup_filename)

      cleanup_remote_files(container_id, remote_temp_path, backup_filename)
      verify_and_finalize_backup(backup_filename)
    end
  end

  private

  def fetch_container_id
    stdout, _status = run_ssh_command("docker ps --filter name=#{CONTAINER_NAME} --format '{{.ID}}'")

    container_id = stdout.strip
    if container_id.empty?
      log_message "ERROR: Could not find container ID. Is the container running?"
      return nil
    end

    log_message "Found container ID: #{container_id}"
    container_id
  end

  def create_backup(container_id, remote_temp_path)
    log_message "Creating backup inside container..."

    _, status = run_ssh_command(
      "docker exec #{container_id} bash -c 'cd /rails/storage && sqlite3 production.sqlite3 \".backup #{remote_temp_path}\"'"
    )

    if !status.success?
      log_message "ERROR: Failed to create backup inside container"
      return false
    end

    true
  end

  def copy_to_host(container_id, remote_temp_path, backup_filename)
    log_message "Copying backup from container to host..."

    _, status = run_ssh_command("docker cp #{container_id}:#{remote_temp_path} /root/#{backup_filename}")

    if !status.success?
      log_message "ERROR: Failed to copy backup from container to host"
      run_ssh_command("docker exec #{container_id} rm -f #{remote_temp_path}")
      return false
    end

    true
  end

  def copy_to_local(container_id, remote_temp_path, backup_filename)
    log_message "Copying backup to local machine..."

    _, status = run_command("scp", "#{SERVER_USER}@#{SERVER_IP}:/root/#{backup_filename}", "#{LOCAL_BACKUP_DIR}/#{backup_filename}")

    if !status.success?
      log_message "ERROR: Failed to copy backup to local machine"
      cleanup_remote_files(container_id, remote_temp_path, backup_filename)
      return false
    end

    true
  end

  def cleanup_remote_files(container_id, remote_temp_path, backup_filename)
    log_message "Cleaning up remote temporary files..."
    run_ssh_command("rm -f /root/#{backup_filename} && docker exec #{container_id} rm -f #{remote_temp_path}")
  end

  def verify_and_finalize_backup(backup_filename)
    backup_path = File.join(LOCAL_BACKUP_DIR, backup_filename)

    if !File.exist?(backup_path) || File.zero?(backup_path)
      log_message "ERROR: Backup file is empty or does not exist"
      return
    end

    cleanup_old_backups
    backup_size = `du -h "#{backup_path}"`.split.first

    log_message "Backup completed successfully! File: #{backup_filename} (Size: #{backup_size})"
    log_message "Backup location: #{backup_path}"
  end

  def cleanup_old_backups
    log_message "Cleaning up backups older than #{RETENTION_DAYS} days..."

    Dir.glob(File.join(LOCAL_BACKUP_DIR, "production_*.sqlite3")).each do |file|
      if File.mtime(file) < Time.now - RETENTION_DAYS * 24 * 60 * 60
        File.delete(file)
      end
    end
  end

  def run_ssh_command(command)
    run_command("ssh", "#{SERVER_USER}@#{SERVER_IP}", command)
  end

  def run_command(*command)
    stdout, _stderr, status = Open3.capture3(*command)
    [ stdout, status ]
  end

  def log_message(message)
    timestamp = Time.current.strftime("%Y-%m-%d %H:%M:%S")
    message = "[#{timestamp}] #{message}"

    puts message
    File.open(LOG_FILE, "a") { |f| f.puts(message) }
  end
end


Finally, I added a Rake task so I could connect the database backup step with other tasks in a pipeline, such as first taking a backup and then restoring the development database with the production data.

# lib/tasks/backup.rake
namespace :db do
  desc 'Backup production database from remote server'
  task backup: :environment do
    DatabaseBackup.run
  end
end

Now you can run backup as follows:

# Take a backup
bin/rails db:backup
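For the restore half of that pipeline, here's a rough sketch (the paths are assumptions; adjust them to your app) that copies the newest downloaded backup over the development database:

```shell
#!/bin/bash
# Pick the most recently modified backup (ls -t sorts newest first)
LATEST=$(ls -t /path/to/backups/production_*.sqlite3 | head -1)

# Overwrite the development database with it (stop the app first
# if it's running, so SQLite isn't mid-write during the copy)
cp "$LATEST" storage/development.sqlite3
```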

You can also set up a cron job to take the backups periodically, but I'll leave that as an exercise for the reader.
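As a starting point for that exercise, a crontab entry along these lines (the paths are hypothetical) would run the backup script every night at 2 AM and append its output to a log:

```shell
# Open the crontab editor with: crontab -e
# Then add a line like this (minute hour day-of-month month day-of-week):
0 2 * * * /path/to/your-app/bin/backup-db.sh >> /path/to/backups/cron.log 2>&1
```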
