Automated Backups to Zero-Knowledge Cloud: A Developer's Guide
How to automate encrypted backups to a zero-knowledge cloud without exposing your master key. Explore CLI tools and agent-based workflows that encrypt locally before syncing to BitAtlas.
Most backup solutions force you to choose: convenience or privacy. Upload to the cloud unencrypted and your provider owns your data. Encrypt locally and now you're responsible for key management across multiple devices. With zero-knowledge cloud storage, there's a third way—fully automated, fully encrypted, with no server-side access.
The Backup Problem We're Solving
Traditional backup workflows are brittle at the encryption boundary:
- You have a directory you want backed up: ~/important-data
- You encrypt it locally (maybe with 7z -p or gpg)
- You upload the archive manually (scp, rsync, or a cloud UI)
- Tomorrow, you do it again. And it works. Until it doesn't.
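Concretely, that manual loop looks something like the sketch below (using tar and gpg symmetric encryption as stand-ins; the passphrase comes from an environment variable here so the script can run unattended, though in the manual workflow it is usually typed at a prompt):

```shell
#!/bin/sh
# Manual daily backup: archive, encrypt locally, then upload by hand.
set -eu
SRC="${SRC:-$HOME/important-data}"
mkdir -p "$SRC"   # ensure the demo directory exists
STAMP=$(date +%Y%m%d)
ARCHIVE="/tmp/backup-$STAMP.tar.gz"

# 1. Archive and compress the directory
tar czf "$ARCHIVE" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# 2. Encrypt locally with a symmetric passphrase
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "${BACKUP_PASSPHRASE:-change-me}" \
    --symmetric --cipher-algo AES256 -o "$ARCHIVE.gpg" "$ARCHIVE"

# 3. Upload by hand. This is the step that gets skipped on busy days.
# scp "$ARCHIVE.gpg" user@backup-host:backups/
rm "$ARCHIVE"
```

Every step works, but nothing enforces that any of them actually happen tomorrow.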
The pain points:
- Manual process: Prone to human error. Backups don't happen consistently.
- Key management nightmare: Your encryption key lives on N devices. One compromise leaks the master key.
- No versioning: You back up once per week. If ransomware hits on day 6, you lose 6 days of work.
- Restore friction: When you need to restore, you're decrypting and managing files manually.
Zero-knowledge cloud storage solves this—but only if you can automate the encryption step and keep your master key safe.
How BitAtlas Enables Automated Backups
BitAtlas's architecture is built for this workflow:
- Client-side encryption before upload: Your master key never leaves your device. Files are AES-256-GCM encrypted in the browser or CLI before touching the network.
- Presigned URLs: The server generates a time-limited, signed URL that lets your client upload directly to object storage (S3 or MinIO) without passing data through the application server.
- Metadata-only server: The server stores encrypted keys and file metadata, but not file contents. Even if the server is compromised, the blobs remain locked.
- Scoped API keys for agents: You can give a CLI tool or autonomous agent an API key that's restricted to specific vaults and operations, without exposing your account password.
This design means you can safely automate backups: your secrets stay on your local machine, and the remote service learns nothing about the data being backed up.
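From the client's side, a presigned-URL upload is just an HTTP PUT of an already-encrypted blob. The sketch below illustrates the shape of that flow; the /api/uploads endpoint, its response format, and the jq field name are assumptions for illustration, while the curl PUT against a presigned URL is the standard object-storage pattern:

```shell
#!/bin/sh
# Sketch: upload an already-encrypted blob via a presigned URL.
set -eu
BLOB="backup-$(date +%Y%m%d).tar.gz.enc"   # encrypted locally before this point

# 1. Ask the server for a presigned URL (hypothetical endpoint and response)
# PRESIGNED_URL=$(curl -s -H "Authorization: Bearer $BITATLAS_API_KEY" \
#     "https://api.bitatlas.example/api/uploads?name=$BLOB" | jq -r .url)

# 2. PUT the ciphertext straight to object storage. The application server
#    never sees the bytes, only the metadata it stored when signing the URL.
# curl -X PUT --upload-file "$BLOB" "$PRESIGNED_URL"

echo "$BLOB"
```

The point of the indirection: the server authorizes the upload without ever being on the data path.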
Automated Backup Patterns
Pattern 1: CLI Tool with Cron
The simplest approach: a shell script that runs on a schedule.
#!/bin/bash
# backup.sh - runs daily via cron
set -euo pipefail

VAULT_ID="default"
BACKUP_DIR="$HOME/important-data"
BITATLAS_API_KEY="bk_..." # Agent key with limited scope
ARCHIVE="/tmp/backup-$(date +%Y%m%d).tar.gz"  # compute once, so the name stays stable across midnight

# Tar and compress
tar czf "$ARCHIVE" "$BACKUP_DIR"

# Upload to BitAtlas using the CLI
bitatlas upload "$ARCHIVE" \
  --vault "$VAULT_ID" \
  --api-key "$BITATLAS_API_KEY"

# Clean up local temp (set -e ensures we never delete after a failed upload)
rm "$ARCHIVE"
Advantages: Simple, no server required, runs on any cron-enabled system (Linux, macOS, even Windows with WSL).
Key points:
- Use an agent API key, not your account password. This limits the attack surface—the key can only access a specific vault.
- The API key is stored locally in ~/.bitatlas/config with restricted file permissions (0600).
- The CLI handles encryption internally; cron sees no secrets passing through shell pipes.
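To schedule the script, a crontab entry like the following runs it every night at 02:00 and keeps a log (the script path and log location are placeholders):

```
# m h dom mon dow  command
0 2 * * * /usr/local/bin/backup.sh >> /var/log/bitatlas-backup.log 2>&1
```

Redirecting stderr matters: cron otherwise mails or discards errors, and a silently failing backup is worse than none.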
Pattern 2: Systemd Timer with Backup Verification
For Linux servers, systemd timers are more reliable than cron:
# /etc/systemd/system/bitatlas-backup.service
[Unit]
Description=BitAtlas Backup Service
Wants=network-online.target
After=network-online.target
[Service]
Type=oneshot
User=backup
ExecStart=/usr/local/bin/backup-script.sh
StandardOutput=journal
StandardError=journal
# /etc/systemd/system/bitatlas-backup.timer
[Unit]
Description=BitAtlas Backup Timer
Requires=bitatlas-backup.service
[Timer]
OnCalendar=daily
OnBootSec=15min
Persistent=true
[Install]
WantedBy=timers.target
Timers offer:
- Persistent execution: if your machine was off during the scheduled time, it runs when it boots.
- Automatic retry on failure.
- Native logging via journalctl.
- No dependency on a cron daemon.
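With both unit files in place, the timer is activated and inspected with the usual systemctl commands:

```
# Reload unit definitions, then enable and start the timer
sudo systemctl daemon-reload
sudo systemctl enable --now bitatlas-backup.timer

# Confirm the next scheduled run and review past output
systemctl list-timers bitatlas-backup.timer
journalctl -u bitatlas-backup.service --since today
```

Note that you enable the timer, not the service; the timer is what triggers the service on schedule.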
Pattern 3: Agent-Based Backups (Autonomous Workflows)
This is where it gets powerful. Instead of cron, let an AI agent manage your backups:
// agent-backup.js - an agent loop that drives backups via Claude tool use
import Anthropic from '@anthropic-ai/sdk';
const client = new Anthropic({
apiKey: process.env.ANTHROPIC_API_KEY,
});
const tools = [
{
name: 'list_backup_directories',
description: 'List directories that need backup',
input_schema: { /* ... */ },
},
{
name: 'compress_and_encrypt',
description: 'Compress a directory and encrypt it locally',
input_schema: { /* ... */ },
},
{
name: 'upload_to_bitatlas',
description: 'Upload encrypted archive to BitAtlas vault',
input_schema: { /* ... */ },
},
{
name: 'verify_backup',
description: 'Verify that the backup succeeded by listing vault contents',
input_schema: { /* ... */ },
},
];
async function runBackupAgent() {
const response = await client.messages.create({
model: 'claude-opus-4-7',
max_tokens: 1024,
system: `You are a backup agent. Your job is to:
1. List all directories that need backup
2. Compress and encrypt each one locally
3. Upload the encrypted archives to BitAtlas
4. Verify each backup succeeded
5. Clean up local temp files`,
tools: tools,
messages: [
{
role: 'user',
content: 'Run daily backup now.',
},
],
});
// Process tool calls, handle uploads, verify...
}
runBackupAgent();
Why agents for backups?
- The agent decides what to back up based on change detection.
- It can verify success (list vault, check file count, validate checksums).
- It can adapt: "disk is full, delete yesterday's backup and retry."
- It runs with minimal human supervision.
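The change-detection step can be as simple as comparing file modification times against a marker file from the last successful run. A minimal, tool-agnostic sketch (the paths are placeholders):

```shell
#!/bin/sh
# Skip the backup entirely when nothing changed since the last successful run.
set -eu
SRC="${SRC:-$HOME/important-data}"
MARKER="${MARKER:-$HOME/.last-backup}"
mkdir -p "$SRC"

if [ -f "$MARKER" ] && [ -z "$(find "$SRC" -newer "$MARKER" | head -n 1)" ]; then
    echo "no changes since last backup; skipping"
else
    echo "changes detected; running backup"
    # ... compress, encrypt, upload ...
    touch "$MARKER"   # record the successful run
fi
```

An agent wraps the same check in a tool call and decides what to do with the answer.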
Security Considerations
API Key Scoping
Never give a backup tool your master password or account key. Create an agent API key with limited permissions:
- Vault: restrict to one backup vault (not your entire data directory).
- Operations: allow upload, list, and delete, but not share or invite_user.
- Expiration: set a rotation policy (e.g., 90 days).
Key Storage
Store the API key in:
- Linux/macOS: ~/.bitatlas/config with chmod 600.
- CI/CD: an environment variable in a secrets manager (GitHub Secrets, GitLab CI variables).
- Not in: version control, Docker images, or shell history.
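The permission check is worth automating too: a backup script can refuse to run when the key file is readable by anyone else. A sketch using GNU stat (on macOS the equivalent flag is stat -f %Lp):

```shell
#!/bin/sh
# Abort if the API key file is not locked down to the owner.
set -eu
CONFIG="${CONFIG:-$HOME/.bitatlas/config}"
mkdir -p "$(dirname "$CONFIG")" && touch "$CONFIG"   # demo setup
chmod 600 "$CONFIG"

MODE=$(stat -c '%a' "$CONFIG")
if [ "$MODE" != "600" ]; then
    echo "refusing to run: $CONFIG has mode $MODE, expected 600" >&2
    exit 1
fi
echo "key file permissions OK ($MODE)"
```

Failing closed here costs nothing and catches the common mistake of copying a config file with loose permissions.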
Encryption Verification
Always verify the encryption happened client-side:
# Check that the file in BitAtlas is a valid encrypted blob, not raw data
bitatlas info backup-archive.tar.gz.bitatlas | grep "encrypted: true"
The server cannot decrypt it; only your local key can.
Disaster Recovery: Restoring Backups
Automated backups are worthless if you can't restore. Test your restore workflow:
# List backups in vault
bitatlas list --vault backup-vault
# Download and decrypt
bitatlas download backup-archive-20260419.tar.gz.bitatlas \
--output backup.tar.gz
# Extract
tar xzf backup.tar.gz
Pro tip: Run a monthly restore test. It catches problems early and proves your backups work.
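A restore test can be scripted the same way the backup is: fetch the latest archive, extract it to a scratch directory, and diff it against the source. A local sketch of the verification half (paths are placeholders; the download step would precede this):

```shell
#!/bin/sh
# Verify that a backup archive restores byte-for-byte.
set -eu
SRC="${SRC:-/tmp/restore-demo-src}"
mkdir -p "$SRC" && echo "payload" > "$SRC/file.txt"   # demo source tree

tar czf /tmp/restore-demo.tar.gz -C "$(dirname "$SRC")" "$(basename "$SRC")"

WORK=$(mktemp -d)
tar xzf /tmp/restore-demo.tar.gz -C "$WORK"

# Fail loudly if the restored tree differs from the source
diff -r "$SRC" "$WORK/$(basename "$SRC")" && echo "restore OK"
```

Wire this into the same timer on a monthly schedule and the restore test stops depending on anyone remembering it.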
Conclusion
Zero-knowledge cloud storage removes the false choice between convenience and privacy. With automated backups, you get:
- Encryption at source: Data is locked before it leaves your device.
- Simple operations: Cron, systemd timers, or agents handle the rest.
- No key management overhead: One master key, unlimited versioned backups.
- True disaster recovery: The server cannot read your backups, and neither can attackers who compromise it.
BitAtlas's MCP server and CLI tools make this workflow frictionless. Set up once, forget about it, and sleep soundly knowing your data is both backed up and private.
Encrypt your agent's data today
BitAtlas gives your AI agents AES-256-GCM encrypted storage with zero-knowledge guarantees. Free tier, no credit card required.
Get Started Free