Hi Everyone
I need some help
I’m currently self-hosting some of my applications on DigitalOcean, and I run containers using Portainer CE. I was wondering how you guys keep backups for applications running on Docker.
I’m currently using DigitalOcean’s snapshots feature, but is there a better way? Any help on this is highly appreciated.
For databases and data I use restic-compose-backup because you can use labels in your docker compose files.
For config files I use a git repository.
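For anyone who prefers plain restic over the compose-label wrapper, a rough sketch of the same split (the repository location, password file, and directories below are just placeholders):

```
# data: back the app data directories up into a restic repository
# (run `restic init` once to create the repo first)
export RESTIC_REPOSITORY=/mnt/backup/restic-repo
export RESTIC_PASSWORD_FILE=/root/.restic-pass
restic backup /srv/docker/appdata

# configs: keep the compose files and configs in a git repository
cd /srv/docker && git add -A && git commit -m "config snapshot" && git push
```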
Almost everything I run is a Docker container, so I made /var/lib/docker a btrfs subvolume, and every day a cron job takes incremental snapshots and copies them to a secondary disk (also btrfs, using btrbk). Since they are btrfs snapshots they don’t use a lot of disk space, and if I really need to roll back an entire day, I can.
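btrbk handles the scheduling and retention, but under the hood it boils down to something like this (mount points and snapshot paths are made up):

```
# take a read-only snapshot of the docker subvolume (daily, from cron)
btrfs subvolume snapshot -r /var/lib/docker /snapshots/docker-$(date +%F)

# send it incrementally to the secondary btrfs disk, using yesterday's
# snapshot as the parent so only the changed blocks are transferred
btrfs send -p /snapshots/docker-$(date -d yesterday +%F) \
    /snapshots/docker-$(date +%F) | btrfs receive /mnt/backup/snapshots
```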
I use Kopia. The CLI is very easy to use and I have backups scheduled nightly. I back up all external mounts and my entire Portainer directory. It has helped in a pinch to restore busted databases.
I point the Kopia CLI at a WebDAV repository I host on my NAS. For off-site backups I run daily backups of that Kopia repository to Google Cloud.
I’m not sure Google Cloud is the best off-site backup solution, but I did a price comparison when I first selected it, and it was the best capacity for the price that I could find at the time.
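If anyone wants to try this, the Kopia CLI side is only a couple of commands (the WebDAV URL and paths here are placeholders):

```
# one-time: create (or later, connect to) the repository on the NAS WebDAV share
kopia repository create webdav --url=https://nas.example.com/webdav/kopia

# nightly, e.g. from cron: snapshot the external mounts and the Portainer dir
kopia snapshot create /srv/docker/mounts /opt/portainer
```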
Haven’t used DigitalOcean, but I run two Proxmox servers and two NAS boxes, one of each at each location.
I back up the containers and VMs that run in Proxmox to the NAS via NFS, and then a nightly script copies the backups from there to my remote NAS. It works; I haven’t lost any data yet. Still thinking about a third backup in another location as well, but money is a thing 🤷
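The nightly copy doesn’t have to be fancy; a single rsync over SSH in a crontab would do it (paths and hostnames are made up):

```
# copy the Proxmox vzdump files from the NFS-mounted NAS to the remote NAS
0 3 * * * rsync -a --delete /mnt/pve-backups/dump/ backup@remote-nas:/volume1/proxmox-backups/
```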
Borg Backup, via borgmatic, to two backup targets: one in my home and a Hetzner Storage Box. Among other things, I include /var/lib/docker/volumes, which covers the mounts that aren’t bound to a host path.
What retention do you run?
I’m setting up the same system but don’t know how far back I need to go. Currently considering 7 daily backups, so I can restore to any point within the week, and 2-3 monthly backups in case there’s an issue I miss for a really long period.

Entirely up to your feelings, I guess - I run 7 dailies and 2 weeklies.
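With Borg/borgmatic that kind of policy is just a set of keep flags, e.g. 7 dailies plus a few weeklies/monthlies (the repo path is a placeholder):

```
borg prune --keep-daily 7 --keep-weekly 2 --keep-monthly 3 /mnt/backup/borg-repo
```

borgmatic exposes the same settings as keep_daily / keep_weekly / keep_monthly in its config file.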
Kopia has been great.
Unraid with Duplicacy and Appdata Backup incremental to Backblaze
I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.
Most of mine are lightweight, so private Git repos.
For big data I have two NAS boxes that sync daily.
Cron jobs to back up important folders to a separate disk (see the crontab sketch below)
Git repo(s) for services & configs, with weekly automated commits and pushes
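Both of those fit in a couple of crontab lines, roughly like this (paths and schedules are just examples):

```
# daily: mirror the important folders to the second disk
0 2 * * * rsync -a --delete /srv/important/ /mnt/disk2/backup/important/

# weekly: commit and push the service/config repo
0 3 * * 0 cd /srv/services && git add -A && git commit -m "weekly backup" && git push
```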
I do the reverse… all configs are Ansible scripts and files, and I just push them to the servers. That way I can spin up a new machine from scratch, completely automated, within minutes… just the time it takes the machine to set itself up.
As others said, use volume mounts; I incrementally back those up with Borg to minimize storage space requirements.
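The incremental part mostly comes for free from Borg’s deduplication; the nightly job boils down to something like this (repo path and archive name are placeholders):

```
# create a dated archive of the docker volumes; unchanged chunks are deduplicated
borg create --stats --compression zstd \
    /mnt/backup/borg-repo::docker-volumes-{now:%Y-%m-%d} \
    /var/lib/docker/volumes
```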
I use rdiff-backup to back up the volumes directory of my VPS to a local machine via VPN. The container images come from a public registry anyway. I also use Ansible for all the configuration and container settings.
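A minimal sketch of that, assuming the VPS is reachable over the VPN as vps.internal (paths are made up, and this uses the classic rdiff-backup CLI form):

```
# pull the docker volumes directory from the VPS over SSH (through the VPN)
rdiff-backup root@vps.internal::/var/lib/docker/volumes /backups/vps-volumes

# trim increments older than a month
rdiff-backup --remove-older-than 4W /backups/vps-volumes
```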
Borg Backup to Hetzner Storage Box.
I use resticker to add an additional backup service to each compose file, which lets me customize some pre-/post-backup actions. Works like a charm 👍
A few hard drives that are stored off-site and rotated every few weeks.