Backup & Restore

A first-class backup/restore feature is planned for a future release. In the meantime, back up the underlying storage directly.

The /data directory is the single source of truth for a club instance — SQLite database, package tarballs, dartdoc HTML, and Flutter SDK installs all live under it. If you keep /data as a Docker volume (or host bind-mount), a fresh container started against the same volume picks everything up automatically: the DB opens, tarballs are served from disk, and SDK directories are rediscovered and re-initialised on startup. The admin UI also surfaces a popup if any database entries point at missing tarballs after a partial restore, so you can clean up stranded records.
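As a concrete illustration, a minimal Docker Compose sketch that keeps /data on a named volume might look like the following (the image name `club:latest` and the port mapping are assumptions, not fixed by this guide):

```yaml
services:
  club:
    image: club:latest
    ports:
      - "8080:8080"
    volumes:
      - club-data:/data   # DB, tarballs, dartdoc HTML, SDK installs all live here

volumes:
  club-data:
```

With this layout, `docker compose down` followed by `docker compose up -d` against the same `club-data` volume is already a crude restore.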

For off-host backups, a complete snapshot covers two components: the metadata store (database) and the blob store (package archives). Both must be captured to restore a club instance to a different host.

What to Back Up

Component      | Contains                                                                                            | Backend Options
Metadata store | Packages, versions, users, tokens, publishers, audit log, scores, download counts, server settings  | SQLite file (default) or PostgreSQL database
Blob store     | .tar.gz package archives                                                                            | Filesystem directory, S3 bucket, or GCS bucket

Metadata Store Backup

SQLite’s .backup command creates a consistent snapshot even while the server is running (thanks to WAL mode).

Backup:

sqlite3 /data/club.db ".backup /backups/club-$(date +%Y%m%d).db"
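After taking a snapshot it is cheap to confirm that it opens cleanly; this sketch reuses the path from the backup command above:

```shell
# Check that the snapshot is a valid, uncorrupted database.
# Prints a single line "ok" on success.
sqlite3 "/backups/club-$(date +%Y%m%d).db" "PRAGMA integrity_check;"
```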

Restore:

  1. Stop the club server.

  2. Replace the database file:

    cp /backups/club-20260409.db /data/club.db
  3. Start the club server. No manual migration step is needed — the server reconciles its schema on startup.


Blob Store Backup

Use tar for a compressed backup of the packages directory, or rsync for incremental backups.

Full backup with tar:

tar -czf /backups/packages-$(date +%Y%m%d).tar.gz -C /data packages/
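Before trusting an archive, you can list its contents without extracting anything; the archive name here mirrors the command above:

```shell
# List the first few entries of the archive to confirm it captured the tree.
tar -tzf "/backups/packages-$(date +%Y%m%d).tar.gz" | head
```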

Incremental backup with rsync:

rsync -av --delete /data/packages/ /backups/packages/

Restore:

# From tar
tar -xzf /backups/packages-20260409.tar.gz -C /data
# From rsync
rsync -av /backups/packages/ /data/packages/
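A quick sanity check after either restore path is to count the tarballs that landed on disk and compare against the same count on the backup side (a sketch, assuming the default /data/packages layout):

```shell
# Count restored package archives; the number should match the backup source.
find /data/packages -type f -name '*.tar.gz' | wc -l
```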

Automated Backup with Cron

Full Backup Script

Create a script that backs up both the metadata and blob stores:

/opt/club/backup.sh
#!/bin/bash
# Full club backup script
set -euo pipefail

BACKUP_DIR="/backups/club"
DATE=$(date +%Y%m%d-%H%M%S)
RETAIN_DAYS=30

mkdir -p "$BACKUP_DIR"
echo "[$(date)] Starting club backup..."

# -------------------------------------------------------
# 1. Back up metadata store
# -------------------------------------------------------

# SQLite
if [ -f /data/club.db ]; then
  echo "Backing up SQLite database..."
  sqlite3 /data/club.db ".backup $BACKUP_DIR/club-$DATE.db"
  echo "SQLite backup complete: $BACKUP_DIR/club-$DATE.db"
fi

# PostgreSQL (uncomment if using postgres)
# echo "Backing up PostgreSQL database..."
# pg_dump -Fc -h db.example.com -U club club \
#   > "$BACKUP_DIR/club-$DATE.dump"
# echo "PostgreSQL backup complete."

# -------------------------------------------------------
# 2. Back up blob store
# -------------------------------------------------------

# Filesystem
if [ -d /data/packages ]; then
  echo "Backing up package archives..."
  tar -czf "$BACKUP_DIR/packages-$DATE.tar.gz" -C /data packages/
  echo "Archive backup complete: $BACKUP_DIR/packages-$DATE.tar.gz"
fi

# S3 (uncomment if using S3)
# echo "Syncing S3 bucket..."
# aws s3 sync s3://club-packages "$BACKUP_DIR/packages-$DATE/"
# echo "S3 sync complete."

# -------------------------------------------------------
# 3. Clean up old backups
# -------------------------------------------------------

echo "Removing backups older than $RETAIN_DAYS days..."
find "$BACKUP_DIR" -type f -mtime +"$RETAIN_DAYS" -delete

echo "[$(date)] Backup complete."

Cron Schedule

Set up a daily backup at 2:00 AM:

# Edit crontab
crontab -e
# Add this line:
0 2 * * * /opt/club/backup.sh >> /var/log/club-backup.log 2>&1

Docker Backup

If club runs in Docker, execute the backup script inside the container or mount the data volumes:

# Backup from outside the container using mounted volumes
docker run --rm \
  -v club-data:/data:ro \
  -v /backups:/backups \
  alpine sh -c '
    apk add --no-cache sqlite
    sqlite3 /data/club.db ".backup /backups/club-$(date +%Y%m%d).db"
    tar -czf /backups/packages-$(date +%Y%m%d).tar.gz -C /data packages/
  '

Full Restore Procedure

  1. Stop the club server.

    docker compose down
    # or: systemctl stop club
  2. Restore the metadata store.

    # SQLite
    cp /backups/club-20260409.db /data/club.db
    # PostgreSQL
    # pg_restore -h db.example.com -U club -d club /backups/club-20260409.dump
  3. Restore the blob store.

    # Filesystem
    tar -xzf /backups/packages-20260409.tar.gz -C /data
    # S3
    # aws s3 sync /backups/packages/ s3://club-packages
  4. Start the club server.

    docker compose up -d
    # or: systemctl start club
  5. Verify the restore by checking the health endpoint and browsing packages:

    curl https://packages.example.com/api/v1/health

Backup Verification

Periodically test your backups by restoring to a temporary instance:

# Start a temporary club instance with restored data
docker run --rm -p 9090:8080 \
  -v /backups/club-20260409.db:/data/club.db \
  -v /tmp/restore-packages:/data/packages \
  -e SERVER_URL=http://localhost:9090 \
  -e JWT_SECRET=test-secret-at-least-32-characters-long \
  club:latest

# Verify packages are accessible
curl http://localhost:9090/api/v1/health
curl -H "Authorization: Bearer <token>" http://localhost:9090/api/packages
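Beyond the health endpoint, a checksum manifest recorded at backup time lets you prove the restored blob store is byte-identical to what was backed up (a sketch; the manifest path and /tmp/restore-packages location are assumptions):

```shell
# At backup time: record a sorted checksum manifest of every package archive.
( cd /data/packages && find . -type f -exec sha256sum {} + | sort ) \
  > /backups/packages-manifest.txt

# After a test restore: an empty diff means every archive matches exactly.
( cd /tmp/restore-packages && find . -type f -exec sha256sum {} + | sort ) \
  | diff /backups/packages-manifest.txt -
```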