
I just started getting into self-hosting using Docker Compose, and I'm wondering about possible backup solutions. So far I only have to save my Docker config, but I want to back up host files as well. What software and hardware are you using for backup?

top 50 comments
[-] YonatanAvhar@programming.dev 34 points 2 years ago

At the moment I'm doing primarily hopes and prayers

[-] Llamajockey@lemmy.world 4 points 2 years ago

I had to upgrade to Hopes&Prayers+ after I ran out of hope and my prayers kept getting returned to sender.

[-] webjukebox@mujico.org 3 points 2 years ago* (last edited 2 years ago)

I was in the same boat, until my prayers went unanswered and my hopes died.

I lost some important data from my phone a few days ago. My plan was to back up that night, but chaos struck that same morning.

[-] BitSound@lemmy.world 20 points 2 years ago

I've been using Borg to back my stuff up. It gets backed up to rsync.net, which has good support for Borg:

https://www.rsync.net/products/borg.html

If you're good enough at computers, you can even set up a special borg account with them that's cheaper and has no tech support.
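
For anyone wondering what that looks like in practice, here's a minimal sketch (username, repo path, and passphrase are placeholders; check rsync.net's docs for the exact remote path and whether you need --remote-path=borg1):

    export BORG_PASSPHRASE='change-me'

    # One-time: create an encrypted repo on the remote
    borg init --encryption=repokey ssh://user@user.rsync.net/./backups/borg

    # Nightly: archive, then prune old archives
    borg create --stats --compression lz4 \
        ssh://user@user.rsync.net/./backups/borg::'{hostname}-{now}' \
        /srv/docker /etc
    borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
        ssh://user@user.rsync.net/./backups/borg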

[-] ErwinLottemann@kbin.social 6 points 2 years ago

I was an rsync.net user for many years and recently switched to BorgBase because of how easy it is to manage multiple backup targets.

[-] tables@kbin.social 3 points 2 years ago

I'm in the same boat right now: Borg and BorgBase.

[-] BitSound@lemmy.world 3 points 2 years ago

That looks cool, and they've got some other nifty looking things like https://www.pikapods.com/. Any idea how stable the company is? I partially like rsync.net because it's pretty unlikely to just disappear someday.

[-] neardeaf@lemm.ee 4 points 2 years ago

Seconding this. On my unRAID host, I run a Docker container called "Vorta" that uses Borg as its backend to back up to my Synology NAS over NFS. Then on my Synology, I run two backup jobs using Hyper Backup: one goes to my cousin's NAS over a site-to-site OpenVPN connection between our edge devices (Ubiquiti UniFi Security Gateway Pro <-> UDM Pro), the other goes to Backblaze B2 cloud storage.

OP, let me know if you need any assistance setting something like this up. Gotta share the knowledge over here on Lemmy that we’re still used to searching evil Reddit for.

[-] 0110010001100010@kbin.social 7 points 2 years ago

Local backup to my Synology NAS every night, which is then replicated to another NAS at my folks' house through a secure VPN tunnel. Pretty simple and easy to deploy.

[-] kaotic@lemmy.world 6 points 2 years ago* (last edited 2 years ago)

I've had excellent luck with Kopia, backing up to Backblaze B2.

At work, I do the same to a local directory in my company provided OneDrive account to keep company data on company resources.
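
If anyone wants to try the same, a rough sketch of the Kopia-to-B2 setup (bucket name and credentials are placeholders; Kopia prompts for a repository password on create):

    # One-time: create an encrypted repository in a B2 bucket
    kopia repository create b2 --bucket=my-backup-bucket \
        --key-id="$B2_KEY_ID" --key="$B2_APP_KEY"

    # Per run: snapshot the data; retention is handled by policy
    kopia snapshot create /srv/data
    kopia policy set --global --keep-daily=7 --keep-weekly=4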

[-] francisco_1844@discuss.online 6 points 2 years ago

Restic for backup - can send backups to S3 and SFTP amongst other target options.

There are S3-compatible (object storage) services, such as Backblaze's B2, which are very affordable for backups.
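
As a rough sketch of what that looks like against B2 (bucket and credentials are placeholders):

    # Credentials and repo password come from the environment
    export B2_ACCOUNT_ID='key-id' B2_ACCOUNT_KEY='application-key'
    export RESTIC_PASSWORD='change-me'

    restic -r b2:my-bucket:/server init        # one-time
    restic -r b2:my-bucket:/server backup /srv/docker
    restic -r b2:my-bucket:/server forget --keep-daily 7 --keep-weekly 4 --prune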

[-] ComptitiveSubset@lemmy.world 5 points 2 years ago* (last edited 2 years ago)

For app data, Borg as the backup/restore software. Backup data is then stored on Hetzner as an offsite backup; super easy and cheap to set up. Also add healthchecks.io to get notified if a backup fails.

Edit: Backup docker compose files and other scripts (without API keys!!!) with git to GitHub.
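
The healthchecks.io part can be as simple as pinging your check's URL only when the backup exits cleanly (the UUID below is a placeholder):

    borg create ... && \
        curl -fsS -m 10 --retry 5 https://hc-ping.com/your-check-uuid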

[-] DataDreadnought@lemmy.one 5 points 2 years ago

I doubt you're using NixOS, so this config might seem useless, but at its core it's a simple systemd timer and some bash scripting.

To convert this to another OS, use cron to call the script at whatever time you want. Copy the part between script = '' and ''; and swap out details like the path to docker-compose, since it's different on NixOS (see the cron sketch after the config).

Let me explain the script. We start by defining the backupDate variable; this becomes the name of the zip file, so today it would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take the stack down. You could replace down with stop if you don't plan on updating each night like I do. With everything stopped, the script zips up /docker/apps, then pulls fresh images and brings each stack back up. I use rclone to connect to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly, rclone deletes anything older than 7 days in the backup folder. If you end up going my route and get stuck, let me know and I can help out. Good luck.

systemd = {
  # Fire the backup service every night at 03:30.
  timers.docker-backup = {
    wantedBy = [ "timers.target" ];
    partOf = [ "docker-backup.service" ];
    timerConfig.OnCalendar = "*-*-* 3:30:00";
  };
  services.docker-backup = {
    serviceConfig.Type = "oneshot";
    serviceConfig.User = "root";
    script = ''
      # Archive name, e.g. server-backup-2023-07-12.zip
      backupDate=$(date +'%F')

      # Stop every stack so the files on disk are consistent.
      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose down

      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose down

      # Zip the app data while everything is stopped.
      cd /docker/backups/
      ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps

      # Pull fresh images and bring each stack back up.
      cd /docker/apps/nginx-proxy-manager
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/apps/paaster
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/apps/rss
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      cd /docker/no-backup-apps/nextcloud
      ${pkgs.docker-compose}/bin/docker-compose pull
      ${pkgs.docker-compose}/bin/docker-compose up -d

      # Upload the archive, drop the local copy, prune remote copies older than 7 days.
      cd /docker/backups/
      ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
      rm server-backup-$backupDate.zip
      ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
    '';
  };
};
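
For the cron conversion mentioned above, a rough equivalent on a non-NixOS box (paths are examples) is the same script with plain binaries:

    #!/usr/bin/env bash
    # /usr/local/bin/docker-backup.sh
    set -euo pipefail
    backupDate=$(date +'%F')

    cd /docker/apps/rss && docker-compose down
    # ...take the other stacks down the same way...

    cd /docker/backups
    zip -r "server-backup-$backupDate.zip" /docker/apps

    # ...docker-compose pull && docker-compose up -d in each app folder...

    rclone copy "server-backup-$backupDate.zip" Dropbox:Server-Backup/
    rm "server-backup-$backupDate.zip"
    rclone delete --min-age 7d Dropbox:Server-Backup/

plus one root crontab entry to run it at 03:30:

    30 3 * * * /usr/local/bin/docker-backup.sh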

[-] lynny@lemmy.world 5 points 2 years ago

Someone here on Lemmy suggested Restic, a backup solution written in Go.

I back up to an internal 4TB HDD every 30 minutes. My most important files are also stored encrypted in the cloud.

Restic is good stuff.

[-] stown@sedd.it 4 points 2 years ago

I host everything in Proxmox VMs, so I just take daily snapshots to my NAS.

[-] hoodlem@hoodlem.me 4 points 2 years ago

I use Backblaze B2. I was using AWS S3 but the Backblaze pricing is significantly better.

[-] gibnihtmus@lemmy.world 3 points 2 years ago

You should look into S3 Glacier Deep Archive. It's about $0.001 per GB per month. The caveat is that there's a 6-month (180-day) minimum storage charge per object.

[-] bier 4 points 2 years ago

My 20 TB of storage is currently hosted by Hetzner on an SMB share with an accompanying server; the storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup for $7/month with unlimited storage, mounting the SMB share as a "physical drive" using Dokan, because Backblaze Personal doesn't allow backing up network shares. If your storage is local, you can run the Windows backup agent in a Docker container.

[-] lnxtx@feddit.nl 4 points 2 years ago

VM instances on Proxmox VE, which integrates natively with Proxmox Backup Server (PBS).

For non-VM hosts, there's a little PBS agent (proxmox-backup-client).

[-] gobbling871@lemmy.world 4 points 2 years ago* (last edited 2 years ago)

I use restic (and Déjà Dup, just to be safe) backing up to multiple cloud storage endpoints. Among these are borgbase.com, Backblaze B2, and Microsoft's cloud.

[-] jakoma02@czech-lemmy.eu 3 points 2 years ago

So far I have had a good experience with Kopia. But it is definitely less battle-tested than the other alternatives, and I don't use it for anything too critical yet.

[-] Wxfisch@lemmy.world 3 points 2 years ago

Backblaze B2. Any software that is S3-compatible can use B2 as the target, and it's reasonably priced for the service. I back up all the PCs and services to a Synology NAS and then back that up to B2 (everything except my Plex media; that would be pricey, and it's easy enough to re-rip from disc if needed).

[-] Rosgote@pawb.social 3 points 2 years ago

On Proxmox, I use the built-in backup system and store the result on my Synology NAS (RS1221+). I use Active Backup for Business (file sync) to back up the Proxmox config files, and also to back up the husband's PC and my work PC.

[-] sam@lemmy.ca 3 points 2 years ago

raid1 + data duplication

Photos, videos, music, documents, etc.. are available on multiple devices using SyncThing.

[-] ErwinLottemann@kbin.social 6 points 2 years ago

RAID is not a backup. I'm not sure about Syncthing; does that count as a backup? Have you tried restoring from it?

[-] sam@lemmy.ca 2 points 2 years ago

Sounds like pedantry to me.

[-] lynny@lemmy.world 3 points 2 years ago

If a program screws up and crashes while writing data to your drive, it can take out more than just the data it was dealing with. RAID will simply replicate that damage to both your drives at the same time, making data recovery impossible.

[-] tables@kbin.social 3 points 2 years ago

It's not pedantry, it's just that RAID and instant data duplication or synchronization aren't meant to protect you from many of the situations in which you would need a backup. If a drive fails, you can restore the information from wherever you duplicated the data to. If, however, your data is corrupted somehow, the corruption is just duplicated over and you have no way to restore the data to a state before the corruption happened. If you accidentally delete files you didn't want to delete, the deletion is replicated over and, again, no way to restore them. RAID wasn't built to solve the problems a backup tries to solve.

[-] sam@lemmy.ca 2 points 2 years ago

Well I guess my personal definition of backup is wrong.

[-] Greidlbeere@feddit.de 2 points 2 years ago

Never tried syncthing. I will look into it.

[-] SeeJayEmm@lemmy.procrastinati.org 3 points 2 years ago

Desktop: I was using Duplicati for years, but I've recently switched to Restic going directly to B2. I'm using this PowerShell script to run it.

Server: I'm also using Restic to B2.

I also have a QNAP NAS. I synchronize my replaceable data to a crappy old Seagate NAS locally; the irreplaceable data goes to B2 via the QNAP backup client.

[-] angrox@feddit.de 3 points 2 years ago

At home I have a Synology NAS for backups of the local desktops. Offsite backups are done with restic to Backblaze B2 and to another location.

[-] NotSpez@lemmy.world 3 points 2 years ago

Duplicati. Works like a charm. Supports practically every backend (S3, Backblaze, OneDrive, Google, Storj, Sia, even Tahoe!).

[-] morethanevil@lmy.mymte.de 3 points 2 years ago

I rsync my data once a day to another drive via a script. If I accidentally delete files, I can easily copy them back. Then once a day, rclone makes an encrypted backup to a Hetzner storage box.
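
A minimal sketch of that pattern (mount point and remote name are placeholders; the encrypted "crypt" remote is set up once with rclone config):

    # Daily local copy; no --delete, so accidentally deleted files stay recoverable
    rsync -a /srv/data/ /mnt/backupdrive/data/

    # Encrypted offsite copy through an rclone crypt remote on the storage box
    rclone sync /mnt/backupdrive/data storagebox-crypt:backup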

[-] thisbenzingring@lemmy.sdf.org 3 points 2 years ago

Veeam is pretty simple and powerful; the Community Edition is free if you're only using it for a small environment (CPU cores are what it counts).

I haven't used it for Docker, but it says that's supported.

[-] vairfoley@reddthat.com 2 points 2 years ago

I use Veeam to backup shares on my NAS to rotated external drives. I also backup a Linux server.

[-] pirate526@kbin.social 3 points 2 years ago

I run a second Unraid server with a couple of backup-related applications, as well as Duplicati. I have my main server network-mounted and run scheduled jobs to copy data from the main pool to the backup pool, as well as to Backblaze. It's nice having the on-site backup as well as the cloud-based one.

I occasionally burn to 100 GB Blu-rays as well for a physical backup.

[-] LanyrdSkynrd@lemmy.world 2 points 2 years ago

Rsync everything besides media to a Storj free account. I also rsync my most important data (Docker Compose files, config files, Home Assistant, a few small databases) to Google Drive.

[-] RxBrad@lemmy.world 2 points 2 years ago

Rsnapshot to an external USB drive.

Probably not the best, but it works for my little 6TB OpenMediaVault server with some Docker thrown in.

I use rsync with an offsite backup.

[-] hitagi@ani.social 2 points 2 years ago

A lot of services have some kind of built-in way to create backup files. I have cronjobs doing that daily, then uploading the results to cloud storage with rclone.

[-] thejevans@lemmy.ml 2 points 2 years ago

Thanks! I just started setting up NixOS on my laptop and I'm planning to use it for servers next. Saving this for later!

[-] p5f20w18k@lemmy.world 2 points 2 years ago

Encrypted backup to Google Drive weekly from Unraid; planning to get a NAS for another backup location.

[-] ScandalFan85@feddit.de 2 points 2 years ago

For my workstation I'm using a small script that packs and compresses all relevant directories with tar once a week. The resulting file is then copied to a local backup drive and to my NAS. An encrypted version of that file is sent to an offsite VPS.

For my self-hosted services (on Proxmox) I'm using Proxmox Backup Server.
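
The workstation part might look roughly like this (directories, hosts, and the symmetric-gpg choice are placeholders):

    stamp=$(date +%F)

    # Weekly archive of the relevant directories
    tar -czf "/backup/workstation-$stamp.tar.gz" /home/me /etc /srv

    # Copy to the local backup drive and the NAS
    cp "/backup/workstation-$stamp.tar.gz" /mnt/backupdrive/
    rsync -a "/backup/workstation-$stamp.tar.gz" nas:/volume1/backups/

    # Encrypt, then ship the encrypted copy to the offsite VPS
    gpg --symmetric --cipher-algo AES256 "/backup/workstation-$stamp.tar.gz"
    scp "/backup/workstation-$stamp.tar.gz.gpg" vps:/backups/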

[-] hogofwar@lemmy.world 2 points 2 years ago

I don't know if it's a smart solution, but I have an HDD in my server that is used just for backups. Each night, rsync automatically copies everything I want backed up from multiple locations onto that drive. After that is done, Kopia backs the drive up to B2 with compression, deduplication, and encryption. I also use healthchecks.io to alert me if any of the steps fails (though none of the steps block each other).

[-] mariom@lemmy.world 2 points 2 years ago

For containers (though I use k3s), I use git to store helmfiles and configuration; secrets live in the CI/CD system.

For the rest, I use autorestic, which backs data up over SSH and to S3.

[-] dr_robot@kbin.social 2 points 2 years ago

ZFS send to a pair of mirrored HDDs on the same machine every hour, and a daily restic backup to S3 storage. Every six months I test and verify the cloud backup.
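
The hourly ZFS leg might look like this (pool and dataset names are placeholders; in practice the previous snapshot name is tracked by a script or a tool like sanoid/syncoid):

    snap="tank/data@$(date +%F-%H)"
    zfs snapshot "$snap"

    # Incremental send; assumes backuppool/data already holds the previous snapshot
    zfs send -i tank/data@last-hour "$snap" | zfs receive backuppool/data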
