Ansible role for managing Duplicity backups with support for multiple storage backends, flexible scheduling, GPG encryption, and optional Zabbix monitoring with auto-detection.
- Multiple Backup Destinations: S3, SFTP, Local filesystem
- Flexible Encryption: GPG key-based or symmetric passphrase encryption
- Configurable Scheduling: Cron-based with preset options (hourly, daily, weekly, monthly)
- Retention Policies: Time-based cleanup and chain-based rotation
- Zabbix Auto-Detection: Automatically configures monitoring when Zabbix agent is present
- Restore Utilities: Comprehensive scripts for full, partial, and point-in-time recovery
- Multi-Distribution Support: Debian 12/13, Ubuntu 20.04/22.04/24.04
- Ansible 2.12 or higher
- Supported operating systems:
  - Debian 12 (Bookworm), 13 (Trixie)
  - Ubuntu 20.04 (Focal), 22.04 (Jammy), 24.04 (Noble)
Install from Ansible Galaxy:

```shell
ansible-galaxy install oxess.duplicity
```

Or install directly from the Git repository:

```shell
ansible-galaxy install git+https://github.com/oxess/ansible-backup-duplicity.git,main
```

Backup to a local filesystem:

```yaml
- hosts: servers
  roles:
    - role: oxess.duplicity
      vars:
        duplicity_destination: "file:///mnt/backup"
        duplicity_passphrase: "{{ vault_backup_passphrase }}"
        duplicity_include_paths:
          - /etc
          - /var/www
          - /home
```

Backup to AWS S3:

```yaml
- hosts: servers
  roles:
    - role: oxess.duplicity
      vars:
        duplicity_destination: "boto3+s3://my-bucket/backups"
        duplicity_passphrase: "{{ vault_backup_passphrase }}"
        duplicity_aws_access_key_id: "{{ vault_aws_key }}"
        duplicity_aws_secret_access_key: "{{ vault_aws_secret }}"
        duplicity_s3_region_name: "eu-central-1"  # Your bucket's region
        duplicity_include_paths:
          - /etc
          - /var/www
```

Backup over SFTP:

```yaml
- hosts: servers
  roles:
    - role: oxess.duplicity
      vars:
        duplicity_destination: "sftp://backup-user@backup.example.com/backups"
        duplicity_passphrase: "{{ vault_backup_passphrase }}"
        duplicity_include_paths:
          - /etc
          - /var/www
```

When using S3 as a backup destination, Duplicity requires specific permissions to operate. Below is the minimal IAM policy needed:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/your-backup-path/*"
    }
  ]
}
```

- `s3:ListBucket`: Required to list existing backup sets and verify backup chains
- `s3:GetBucketLocation`: Needed to determine the bucket's region for proper API calls
- `s3:GetObject`: Required to download backup archives during restore operations and to verify existing backups
- `s3:PutObject`: Required to upload new backup archives
- `s3:DeleteObject`: Required for cleanup operations when removing old backups according to retention policies
1. Create an IAM user specifically for backups:

   ```shell
   aws iam create-user --user-name duplicity-backup
   ```

2. Create and attach the policy:

   ```shell
   aws iam create-policy --policy-name DuplicityBackupPolicy --policy-document file://duplicity-policy.json
   aws iam attach-user-policy --user-name duplicity-backup --policy-arn arn:aws:iam::YOUR_ACCOUNT_ID:policy/DuplicityBackupPolicy
   ```

3. Create access keys:

   ```shell
   aws iam create-access-key --user-name duplicity-backup
   ```
For S3-compatible storage providers, use the same permissions structure but with the appropriate endpoint:
```yaml
- hosts: servers
  roles:
    - role: oxess.duplicity
      vars:
        duplicity_destination: "boto3+s3://bucket/backups"
        duplicity_s3_endpoint_url: "https://your-minio-server:9000"
        duplicity_aws_access_key_id: "{{ vault_minio_access_key }}"
        duplicity_aws_secret_access_key: "{{ vault_minio_secret_key }}"
```

Ansible Vault should be used to protect sensitive information like passwords, API keys, and passphrases. Here's a step-by-step guide:
Create an encrypted vault file to store your sensitive variables:
```shell
ansible-vault create group_vars/all/vault.yml
```

You'll be prompted to create a vault password. Remember this password, as you'll need it to decrypt the file.
Edit the vault file and add your sensitive data:
```shell
ansible-vault edit group_vars/all/vault.yml
```

Add the following variables (example for AWS S3):
```yaml
# Encryption passphrase
vault_backup_passphrase: "your-strong-backup-passphrase-here"

# AWS Credentials
vault_aws_access_key_id: "AKIAIOSFODNN7EXAMPLE"
vault_aws_secret_access_key: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

# For SFTP (if needed)
vault_sftp_password: "your-sftp-password"

# For GPG encryption (if using)
vault_gpg_passphrase: "your-gpg-key-passphrase"
```

Create a playbook that references the vault variables:
```yaml
---
# playbook.yml
- hosts: backup_servers
  become: yes
  roles:
    - role: oxess.duplicity
      vars:
        # S3 destination using vault variables
        duplicity_destination: "boto3+s3://your-bucket-name/backups/{{ inventory_hostname }}"

        # Use vault-encrypted variables
        duplicity_passphrase: "{{ vault_backup_passphrase }}"
        duplicity_aws_access_key_id: "{{ vault_aws_access_key_id }}"
        duplicity_aws_secret_access_key: "{{ vault_aws_secret_access_key }}"

        # Backup configuration
        duplicity_include_paths:
          - /etc
          - /var/www
          - /home

        # Schedule daily backups at 2 AM
        duplicity_cron_schedule: "daily"
        duplicity_cron_hour: "2"
        duplicity_cron_minute: "0"

        # Retention: keep 4 weeks of backups
        duplicity_remove_older_than: "4W"
        duplicity_full_if_older_than: "1W"
```

Execute the playbook with the vault password:
```shell
# Option 1: Enter vault password interactively
ansible-playbook -i inventory playbook.yml --ask-vault-pass

# Option 2: Use a vault password file
echo "your-vault-password" > .vault_pass
chmod 600 .vault_pass
ansible-playbook -i inventory playbook.yml --vault-password-file .vault_pass

# Option 3: Set environment variable (for CI/CD)
export ANSIBLE_VAULT_PASSWORD_FILE=.vault_pass
ansible-playbook -i inventory playbook.yml
```

1. Create a vault with AWS credentials:
   ```shell
   ansible-vault create group_vars/all/vault.yml
   ```

   Content:
   ```yaml
   vault_backup_passphrase: "MySecureBackup#Pass2024!"
   vault_aws_access_key_id: "AKIAIOSFODNN7EXAMPLE"
   vault_aws_secret_access_key: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
   vault_aws_bucket: "your-bucket-name"
   ```

2. Create the playbook (`backup-setup.yml`):
   ```yaml
   ---
   - name: Configure Duplicity backups to AWS S3
     hosts: production
     become: yes
     vars:
       backup_bucket: "{{ vault_aws_bucket }}"
       backup_prefix: "{{ ansible_hostname }}/{{ ansible_date_time.year }}"
     roles:
       - role: oxess.duplicity
         vars:
           # S3 destination with dynamic path
           duplicity_destination: "boto3+s3://{{ backup_bucket }}/{{ backup_prefix }}"

           # Encryption
           duplicity_encryption_method: "symmetric"
           duplicity_passphrase: "{{ vault_backup_passphrase }}"

           # AWS credentials from vault
           duplicity_aws_access_key_id: "{{ vault_aws_access_key_id }}"
           duplicity_aws_secret_access_key: "{{ vault_aws_secret_access_key }}"

           # What to backup
           duplicity_include_paths:
             - /etc
             - /var/www
             - /home
             - /var/lib/mysql
           duplicity_exclude_patterns:
             - "**/cache/**"
             - "**/*.log"
             - "**/tmp/**"
             - "**/.git/**"

           # Schedule: Daily at 3 AM
           duplicity_cron_enabled: true
           duplicity_cron_schedule: "daily"
           duplicity_cron_hour: "3"
           duplicity_cron_minute: "0"

           # Retention policy
           duplicity_remove_older_than: "30D"
           duplicity_full_if_older_than: "7D"
           duplicity_keep_full_chains: 3

           # Performance
           duplicity_volsize: 500

           # Monitoring
           duplicity_zabbix_enabled: "auto"
           duplicity_log_enabled: true
   ```

3. Deploy the configuration:
   ```shell
   # Test run (check mode)
   ansible-playbook -i inventory backup-setup.yml --ask-vault-pass --check

   # Actual deployment
   ansible-playbook -i inventory backup-setup.yml --ask-vault-pass

   # Verify the backup is working
   ansible all -i inventory -m command -a "sudo duplicity-status" --ask-vault-pass
   ```
- Never commit unencrypted vault files. Add to `.gitignore`:

  ```
  *.vault
  .vault_pass
  vault_password.txt
  ```

- Use separate vaults for different environments:

  ```
  group_vars/production/vault.yml
  group_vars/staging/vault.yml
  group_vars/development/vault.yml
  ```

- Rotate credentials regularly, and re-key vault files with a new vault password:

  ```shell
  ansible-vault rekey group_vars/all/vault.yml
  ```

- View vault contents without editing:

  ```shell
  ansible-vault view group_vars/all/vault.yml
  ```

- Encrypt existing files:

  ```shell
  ansible-vault encrypt existing_vars.yml
  ```

- Decrypt temporarily for editing:

  ```shell
  ansible-vault decrypt vault.yml
  # Edit the file
  ansible-vault encrypt vault.yml
  ```
For automated deployments, store the vault password securely:
GitHub Actions example:
```yaml
- name: Run Ansible Playbook
  env:
    ANSIBLE_VAULT_PASSWORD: ${{ secrets.ANSIBLE_VAULT_PASSWORD }}
  run: |
    echo "$ANSIBLE_VAULT_PASSWORD" > .vault_pass
    ansible-playbook -i inventory playbook.yml --vault-password-file .vault_pass
    rm .vault_pass
```

GitLab CI example:
```yaml
deploy:
  script:
    - echo "$ANSIBLE_VAULT_PASSWORD" > .vault_pass
    - ansible-playbook -i inventory playbook.yml --vault-password-file .vault_pass
    - rm .vault_pass
  variables:
    ANSIBLE_VAULT_PASSWORD: $CI_ANSIBLE_VAULT_PASSWORD
```

Required variables:

| Variable | Description |
|---|---|
| `duplicity_destination` | Backup destination URL (`s3://`, `sftp://`, `file://`) |
| `duplicity_passphrase` | Encryption passphrase (use Ansible Vault) |
Backup content:

| Variable | Default | Description |
|---|---|---|
| `duplicity_include_paths` | `[/etc, /var/www]` | Directories to back up |
| `duplicity_exclude_patterns` | `["**/*.log", ...]` | Glob patterns to exclude |
Encryption:

| Variable | Default | Description |
|---|---|---|
| `duplicity_encryption_method` | `symmetric` | `gpg`, `symmetric`, or `none` |
| `duplicity_passphrase` | `""` | Encryption passphrase |
| `duplicity_gpg_key` | `""` | GPG key ID for asymmetric encryption |
| `duplicity_sign_key` | `""` | GPG key for signing backups |
Storage credentials:

| Variable | Default | Description |
|---|---|---|
| `duplicity_aws_access_key_id` | `""` | AWS/S3 access key |
| `duplicity_aws_secret_access_key` | `""` | AWS/S3 secret key |
| `duplicity_s3_endpoint_url` | `""` | Custom S3 endpoint (MinIO, etc.) |
| `duplicity_sftp_password` | `""` | SFTP password (prefer SSH keys) |
Scheduling:

| Variable | Default | Description |
|---|---|---|
| `duplicity_cron_enabled` | `true` | Enable cron job |
| `duplicity_cron_hour` | `"3"` | Hour to run backup |
| `duplicity_cron_minute` | `"0"` | Minute to run backup |
| `duplicity_cron_schedule` | `""` | Preset: `hourly`, `daily`, `weekly`, `monthly` |
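With the scheduling defaults above (hour 3, minute 0), the cron entry generated by the role would have roughly this shape. This is illustrative only; the actual file name and script path are determined by the role (the path below is a guess based on the script directory shown in Troubleshooting):

```
# e.g. /etc/cron.d/duplicity-backup (hypothetical path)
0 3 * * * root /etc/duplicity/scripts/duplicity-backup
```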
Retention:

| Variable | Default | Description |
|---|---|---|
| `duplicity_remove_older_than` | `"4W"` | Remove backups older than this (e.g. `1D`, `1W`, `1M`) |
| `duplicity_full_if_older_than` | `"1W"` | Force a full backup after this period |
| `duplicity_keep_full_chains` | `2` | Number of full chains to keep |
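The retention values use Duplicity's time-interval format: a count followed by a unit character. As a rough sketch only (not Duplicity's or the role's actual parser), such strings map to durations as below, assuming Duplicity's convention of treating a month as 30 days and a year as 365:

```python
import re
from datetime import timedelta

# Seconds per unit in duplicity-style intervals.
# M and Y are approximations: a month counts as 30 days, a year as 365.
UNITS = {"s": 1, "m": 60, "h": 3600, "D": 86400, "W": 604800,
         "M": 30 * 86400, "Y": 365 * 86400}

def parse_interval(spec: str) -> timedelta:
    """Parse an interval like '4W' or '30D' into a timedelta."""
    match = re.fullmatch(r"(\d+)([smhDWMY])", spec)
    if not match:
        raise ValueError(f"bad interval: {spec!r}")
    count, unit = int(match.group(1)), match.group(2)
    return timedelta(seconds=count * UNITS[unit])

print(parse_interval("4W"))   # 28 days, 0:00:00
print(parse_interval("30D"))  # 30 days, 0:00:00
```

So the defaults above mean: backup sets older than four weeks are pruned, and a fresh full backup is forced once the last one is over a week old.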
Zabbix monitoring:

| Variable | Default | Description |
|---|---|---|
| `duplicity_zabbix_enabled` | `"auto"` | `auto`, `true`, or `false` |
| `duplicity_zabbix_agent_conf_dir` | `/etc/zabbix/zabbix_agentd.conf.d` | Zabbix agent config directory |
| `duplicity_zabbix_check_hours` | `24` | Hours to look back for a successful backup |
Performance and logging:

| Variable | Default | Description |
|---|---|---|
| `duplicity_volsize` | `200` | Volume size in MB |
| `duplicity_max_retries` | `3` | Max retry attempts on failure |
| `duplicity_log_enabled` | `true` | Enable logging to file |
| `duplicity_log_to_syslog` | `true` | Log to syslog |
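To put the volume size in perspective: a backup is uploaded as ceil(total size / volsize) volume files. A quick calculation with hypothetical numbers:

```python
import math

volsize_mb = 200          # duplicity_volsize default
backup_size_mb = 50_000   # hypothetical ~50 GB full backup

# Each volume holds up to volsize_mb, so round up to count the files.
volumes = math.ceil(backup_size_mb / volsize_mb)
print(volumes)  # 250
```

Larger volumes mean fewer objects on the remote but more data re-uploaded when a transfer fails mid-volume.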
The role installs several utility scripts:
| Script | Description |
|---|---|
| `duplicity-backup` | Main backup script (run by cron) |
| `duplicity-status` | Show backup collection status |
| `duplicity-restore` | Restore utility with multiple options |
| `duplicity-list` | List files in backup archive |
Run a backup manually:

```shell
sudo duplicity-backup
```

Show backup status:

```shell
duplicity-status
```

List available backups:

```shell
duplicity-restore list
```

Full restore:

```shell
# Restore to /restore directory
duplicity-restore restore

# Restore to custom directory
duplicity-restore restore / /tmp/full-restore
```

Partial restore:

```shell
# Restore /etc/nginx to /tmp/nginx-restore
duplicity-restore restore /etc/nginx /tmp/nginx-restore
```

Point-in-time restore:

```shell
# Restore from 3 days ago
duplicity-restore restore --time 3D /etc /tmp/etc-restore

# Restore from specific date
duplicity-restore restore --time 2024-01-15 /var/www /tmp/www-restore
```

Restore a single file:

```shell
duplicity-restore file /etc/nginx/nginx.conf /tmp/nginx.conf.bak
```

Verify backup integrity:

```shell
duplicity-restore verify
```

Preview a restore without writing files:

```shell
duplicity-restore restore --dry-run /etc
```

List files in the archive:

```shell
# List all files
duplicity-list

# List files under /etc
duplicity-list /etc

# List files from 3 days ago
duplicity-list --time 3D /var/www
```
1. Install Duplicity on the new server:

   ```shell
   apt install duplicity python3-boto3 python3-paramiko gnupg
   ```

2. Copy credentials (GPG keys, or set environment variables):

   ```shell
   export PASSPHRASE="your-backup-passphrase"
   export AWS_ACCESS_KEY_ID="your-key"
   export AWS_SECRET_ACCESS_KEY="your-secret"
   ```

3. List available backups:

   ```shell
   duplicity collection-status s3://bucket/path
   ```

4. Restore:

   ```shell
   duplicity restore s3://bucket/path /restore
   ```

5. Verify the restored files and move them to their final locations.
When Zabbix agent is detected (or explicitly enabled), the role configures monitoring with these UserParameters:
| Key | Description |
|---|---|
| `duplicity.backup.success_count` | Successful backups in the last N hours |
| `duplicity.backup.running` | Whether a backup is currently running (1/0) |
| `duplicity.backup.last_success` | Timestamp of the last successful backup |
| `duplicity.backup.error_count` | Errors in the last 24 hours |
| `duplicity.backup.collection_count` | Number of backup sets |
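As an illustration of the mechanism only (the role deploys its own definitions), a Zabbix `UserParameter` line maps such a key to a shell command in the agent config directory, along these lines:

```
# Hypothetical sketch - not the role's actual check
UserParameter=duplicity.backup.running,pgrep -f duplicity >/dev/null && echo 1 || echo 0
```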
- Credentials: Store in Ansible Vault, deployed with mode 0600
- Scripts: Owned by root, mode 0750
- Config Directory: Mode 0750
- Lock File: Prevents concurrent backup runs
- GPG Keys: Managed externally, referenced by ID
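The lock-file item above can be sketched as follows: a hypothetical Python illustration of flock-based mutual exclusion, assuming the role's script uses an equivalent mechanism (it may instead use `flock(1)` or a pidfile in shell; the lock path here is made up):

```python
import fcntl

LOCK_PATH = "/tmp/duplicity-demo.lock"  # hypothetical path for illustration

def try_lock(path):
    """Try to take an exclusive, non-blocking flock on the file.

    Returns the open file object on success (keep it open to hold the
    lock), or None if another process already holds the lock.
    """
    f = open(path, "w")
    try:
        fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return f
    except BlockingIOError:
        f.close()
        return None

lock = try_lock(LOCK_PATH)
print("acquired" if lock else "backup already running")
```

A second invocation while the first still holds the lock gets `None` and can exit immediately instead of starting a concurrent backup.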
Run Molecule tests:
```shell
pip install molecule molecule-plugins[docker]
molecule test
```

Symptom:

```
Attempt of list Nr. 1 failed. ClientError: An error occurred (403) when calling the HeadBucket operation: Forbidden
Attempt of list Nr. 2 failed. AttributeError: 'NoneType' object has no attribute 'objects'
```
Cause: Duplicity 3.0+ with the boto3 backend requires a different URL format. The hostname (s3.amazonaws.com) should NOT be included in the destination URL.
Solution: Change your `duplicity_destination` from the old format to the new format:
```yaml
# Wrong (old format with hostname)
duplicity_destination: "boto3+s3://s3.amazonaws.com/my-bucket/path"
duplicity_destination: "s3://s3.amazonaws.com/my-bucket/path"

# Correct (new format - bucket name only)
duplicity_destination: "boto3+s3://my-bucket/path"
```

Also ensure `duplicity_s3_region_name` is set to your bucket's region:

```yaml
duplicity_s3_region_name: "eu-central-1"
```

Verification: After fixing, run `duplicity-status` on the server to confirm the backup collection is accessible.
Symptom: 401 Unauthorized or credentials-related errors.
Diagnosis: Test credentials directly on the server:
```shell
source /etc/duplicity/backup.env
python3 -c "
import boto3
s3 = boto3.client('s3')
print(s3.head_bucket(Bucket='your-bucket-name'))
"
```

Common causes:
- Wrong `AWS_ACCESS_KEY_ID` or `AWS_SECRET_ACCESS_KEY`
- IAM user lacks required permissions (see AWS S3 Permissions)
- Bucket policy denying access
Diagnosis: Check logs:
```shell
# Check syslog
grep duplicity /var/log/syslog

# Check dedicated log file (if enabled)
cat /var/log/duplicity/backup.log

# Run backup manually with verbose output
duplicity-backup
```

Symptom: Redirects or region-related errors.
Solution: Always specify the region explicitly:
```yaml
duplicity_s3_region_name: "us-east-1"  # or your bucket's region
```

Find your bucket's region:

```shell
aws s3api get-bucket-location --bucket your-bucket-name
```

Symptom: Cannot execute `duplicity-backup` or other scripts.
Solution: Scripts should be owned by root with mode 0750:
```shell
ls -la /etc/duplicity/scripts/
# Should show: -rwxr-x--- root root
```

Re-run the Ansible role to fix permissions.
MIT
oxess