Some time back, I asked a question on r/selfhosted (Post) about self-hosting password managers on-premise. One of the things people focused on was managing backups effectively; I have summarized the intent of the post in a short-form blog here if you want a TL;DR version of it.

After that, I started exploring options for automated off-site backups of the essential data from the services I self-host.

Currently, I have a solution in place (rclone + bash scripts), and it has been working great for the past month. Hence, I decided to write about it and share it in case it helps others set up an automated solution for themselves as well.
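For anyone curious what such a setup roughly looks like, here is a minimal sketch of an rclone wrapper script you could run from cron. The remote name `offsite`, the paths, and the variable names are my assumptions for illustration, not the exact setup from the blog:

```shell
#!/usr/bin/env bash
# Minimal sketch of an off-site backup wrapper around rclone.
# The remote "offsite" must already exist (created via `rclone config`);
# all names here are illustrative assumptions, not the blog's exact setup.
set -euo pipefail

SOURCE_DIR="${SOURCE_DIR:-/srv/backups}"        # local exports from each service
REMOTE="${REMOTE:-offsite:homelab-backups}"     # rclone remote:bucket
LOG_FILE="${LOG_FILE:-/tmp/rclone-backup.log}"

if command -v rclone >/dev/null 2>&1; then
  # Mirror the local backup directory to the remote, pruning deleted files.
  rclone sync "$SOURCE_DIR" "$REMOTE" --log-level INFO --log-file "$LOG_FILE" \
    || echo "rclone sync failed; check $LOG_FILE" >&2
else
  echo "rclone not installed; install it and run 'rclone config' first" >&2
fi
```

A cron entry such as `0 3 * * * /usr/local/bin/offsite-backup.sh` would then run it nightly.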

Here is the blog: https://akashrajpurohit.com/blog/how-i-safeguard-essential-data-in-my-homelab-with-offsite-backup-on-cloud/

I would also love to know if folks around here have their own solutions or ways of backing up important data in their homelab.

  • gazoscalvertos@alien.topB
    9 months ago

    I wrote an application from scratch that allows backup jobs to be added/scheduled with the usual incremental and full options. I wanted to use Backblaze and do it as efficiently as possible, as well as learn a few things along the way. The thing I realised when using a Backblaze app on Linux for backups was that the list charges become massive when dealing with a huge photo library, videos, millions of files, etc. In total I'm backing up around 4-5 TB.

    The application I use stores a checksum of each file in a local MySQL DB, so if a rescan is necessary it's done locally. It also lets me store encryption keys locally rather than depending on Backblaze's encryption keys.
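The local-checksum idea can be sketched with plain coreutils; here a flat manifest file stands in for the commenter's MySQL table, and all paths and file names are illustrative assumptions:

```shell
# Sketch: keep path + checksum locally so a rescan never touches the
# remote's (billable) list API. A flat file stands in for the MySQL table.
MANIFEST=/tmp/manifest.sha256
DATA=/tmp/data-demo
mkdir -p "$DATA"
printf 'v1' > "$DATA/report.txt"

# Initial scan: record a checksum for every file, locally.
(cd "$DATA" && sha256sum *.txt) > "$MANIFEST"

# Later the file changes on disk...
printf 'v2' > "$DATA/report.txt"

# Rescan locally: files whose checksum no longer matches are the only
# ones that need re-uploading, with no remote listing required.
(cd "$DATA" && sha256sum -c "$MANIFEST" 2>/dev/null) | grep 'FAILED' || true
```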

    Since I'm storing checksums locally, I can also do duplicate checks and purges, which is great for images.
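The duplicate check can likewise be sketched in a few lines of shell: `uniq -w64` groups lines that share the same 64-character SHA-256 hash. The demo directory and sample files are illustrative assumptions:

```shell
# Sketch: find duplicate images by content hash. The demo directory and
# files are illustrative; point `find` at a real photo library instead.
mkdir -p /tmp/photo-demo
printf 'beach' > /tmp/photo-demo/img1.jpg   # duplicate content
printf 'beach' > /tmp/photo-demo/img2.jpg   # duplicate content
printf 'hills' > /tmp/photo-demo/img3.jpg   # unique content

# Hash every file, sort so identical hashes are adjacent, then keep only
# lines whose 64-character checksum column repeats (i.e. the duplicates).
find /tmp/photo-demo -type f -print0 \
  | xargs -0 sha256sum \
  | sort \
  | uniq -w64 -D
```

This prints only the two files with identical "beach" content, so either copy can be purged safely.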

    Probably overkill, and no doubt other applications exist that do the job better, but this runs flawlessly and costs around £7.50 a month for my data needs.