submitted on 10 May 2025 (last edited 2 months ago) by CatLikeLemming to c/selfhosted@lemmy.world

I'm planning on setting up a NAS/home server (primarily storage, with some Jellyfin and Nextcloud and such mixed in), and since it's primarily for data storage I'd like to follow the 3-2-1 backup rule: 3 copies on 2 different media with 1 offsite. Well, actually I'm more going for a 2-1, with 2 copies and one offsite, but that's beside the point. Now I'm wondering how to do the offsite backup properly.

My main goal would be an automatic system that does full system backups at a reasonable interval (I assume daily would be a bit much considering it's gonna be a few TB worth of HDDs, which aren't exactly fast, but maybe weekly?) and then keeps 2-3 of those backups offsite at once, as a sort of version control, if possible.

This has two components, the local upload system and the offsite storage provider. First the local system:

What is good software to encrypt the data before/while it's uploaded?

While I'd prefer to upload the data to a provider I trust, accidents happen, and since they don't need to access the data, I'd rather they weren't able to at all, maliciously or otherwise. So what's a good way to encrypt the data before it leaves my system?

What is a good way to upload the data?

After it has been encrypted, it needs to be sent. Is there any good software that can upload backups automatically at regular intervals? Maybe something that also handles the encryption part along the way?

Then there's the offsite storage provider. Personally I'd appreciate as many suggestions as possible, since there's of course no one-size-fits-all, so if you've had good experiences with any, please do share their names. I'm basically just looking for network-attached drives: I send my data to them, I leave it there and trust it stays there, and in case more drives in my system fail than RAID-Z can handle (two, in my setup), I'd like to be able to get the data back after I've replaced my drives. That's all I really need from them.

For reference, this is gonna be my first NAS/server/anything of this sort. I realize it's mostly a regular computer and am familiar enough with Linux, so I can handle the basic stuff, but the things you wouldn't do with a normal computer I'm quite unfamiliar with, so if any questions here seem dumb, I apologize. Thank you in advance for any information!

[-] huquad@lemmy.ml 41 points 2 months ago

Syncthing to a Pi at my parents' place.

[-] AtariDump@lemmy.world 13 points 2 months ago

But doesn't that sync in real time, making it not a true backup?

[-] huquad@lemmy.ml 9 points 2 months ago

Agreed. I have it configured on a delay and with multiple file versions. I also have another pi running rsnapshot (rsync tool).

[-] AtariDump@lemmy.world 2 points 2 months ago
[-] huquad@lemmy.ml 7 points 2 months ago

For the delay, I just reduce how often it checks for new files instead of letting it sync instantly.

[-] rumba@lemmy.zip 4 points 2 months ago

Edit the share, enable file versioning, choose which flavor.
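If you'd rather script that than click through the GUI, Syncthing's REST API can set the same options. A hedged sketch only: the folder ID `backups`, the GUI on localhost:8384 and the API key in `$STGUI_APIKEY` are assumptions, and the field names should be checked against your Syncthing version.

```bash
# Hypothetical sketch: enable staggered file versioning and slow the rescan
# down so changes propagate with a delay. Folder ID, port and API key are
# placeholders, not values from this thread.
curl -sS -X PATCH "http://localhost:8384/rest/config/folders/backups" \
  -H "X-API-Key: ${STGUI_APIKEY}" \
  -H "Content-Type: application/json" \
  -d '{
        "rescanIntervalS": 3600,
        "fsWatcherEnabled": false,
        "versioning": { "type": "staggered", "params": { "maxAge": "31536000" } }
      }'
```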

[-] Appoxo@lemmy.dbzer0.com 2 points 2 months ago

In theory you could set up a cron job with Docker Compose that fires up a container, lets it sync, and shuts everything down once all endpoint jobs have synced.
As it seemingly has an API, it should be possible.
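A rough sketch of that idea, assuming a Compose project in /opt/syncthing-backup with port 8384 published, a folder ID of `backups`, and the remote device ID plus API key in environment variables; none of these names come from the comment above.

```bash
#!/usr/bin/env bash
# Start the Syncthing container, wait until the remote device reports 100%
# completion for the folder, then shut everything down again.
set -euo pipefail
cd /opt/syncthing-backup
docker compose up -d

until curl -sf -H "X-API-Key: ${STGUI_APIKEY}" \
    "http://localhost:8384/rest/db/completion?folder=backups&device=${REMOTE_DEVICE_ID}" \
    | grep -q '"completion": *100'; do
  sleep 60
done

docker compose down

# Example crontab entry (weekly, Sunday 03:00):
# 0 3 * * 0 /opt/syncthing-backup/sync-once.sh
```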

[-] Malatesta@lemmy.world 12 points 2 months ago

Low-power server in a friend's basement running Syncthing.

[-] SorteKanin@feddit.dk 4 points 2 months ago

A pi with multiple terabytes of storage?

[-] huquad@lemmy.ml 9 points 2 months ago* (last edited 2 months ago)

My most critical data is only ~2-3TB, including backups of all my documents and family photos, so I have a 4TB SSD attached which the Pi also boots from. I have ~40TB of other Linux ISOs with 2-drive redundancy but no backups. If I lose those, I can always redownload them.

[-] rutrum@programming.dev 30 points 2 months ago

I use Borg Backup. It, and another tool called restic, are made for creating encrypted backups. Further, they can run backups regularly and only back up the differences, which means you can take a daily backup without making a new copy of your entire library. They also let you, as part of compressing and encrypting, back up to a remote machine over SSH. I think you should start with either of those.
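To make that concrete, a minimal Borg sketch; the SSH host and paths are placeholders.

```bash
# One-time: create an encrypted repository on the remote machine.
borg init --encryption=repokey-blake2 ssh://backup@offsite.example/./backups/nas

# Recurring: deduplicated, compressed archive; only changed chunks are sent,
# so runs after the first one are far smaller than the full data set.
borg create --stats --compression zstd \
    ssh://backup@offsite.example/./backups/nas::'{hostname}-{now:%Y-%m-%d}' \
    /srv/data /etc

# Thin out old archives so the repository doesn't grow forever.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 \
    ssh://backup@offsite.example/./backups/nas
```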

One provider that's built for cloud backups is BorgBase. It can be the location you back a Borg (or, I think, restic) repository up to. There are others that are made to be easily accessed with these backup tools.

Lastly, I'll mention that Borg handles making a backup but doesn't handle scheduling. Borgmatic is another tool that, given a YAML configuration file, will run the Borg commands on a schedule with the defined arguments. You could also use something like systemd timers or cron to handle the schedule.
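A hedged sketch of that: a minimal borgmatic config plus a cron entry. The key names match recent borgmatic releases, so generate a reference config with borgmatic's own generator and adjust the paths, repository and passphrase for your setup; everything below is a placeholder.

```bash
sudo mkdir -p /etc/borgmatic
sudo tee /etc/borgmatic/config.yaml >/dev/null <<'EOF'
# Placeholder paths, repository and passphrase -- adjust everything.
source_directories:
    - /srv/data
    - /etc

repositories:
    - path: ssh://backup@offsite.example/./backups/nas
      label: offsite

encryption_passphrase: "change-me"

keep_daily: 7
keep_weekly: 4
keep_monthly: 6
EOF

# Weekly run (Sunday 02:00); borgmatic handles create, prune and check.
echo '0 2 * * 0 root borgmatic --verbosity 1 --syslog-verbosity 1' \
    | sudo tee /etc/cron.d/borgmatic
```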

Personally, I use Borgbackup configured through NixOS (which creates the systemd units for daily backups), and I back up to a different computer in my house and to BorgBase. That gives me 3 copies: 1 in the cloud and 2 at home.

[-] mhzawadi@lemmy.horwood.cloud 24 points 2 months ago

There are some really good options in this thread; just remember that whatever you pick, unless you test your backups, they're as good as not existing.

[-] redbr64@lemmy.world 6 points 2 months ago

Is there some good automated way of doing that? What would it look like, something that compares hashes?

[-] mhzawadi@lemmy.horwood.cloud 6 points 2 months ago

That very much depends on your backup tool of choice, and that's also the point: how do you recover your backup?

Start with a manual recovery: pull a backup, unpack it, and check that important files open. Write down all the steps you did, then work out how to automate them.
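As a concrete example of such a drill (using restic here purely for illustration; the repository URL and paths are placeholders):

```bash
export RESTIC_REPOSITORY=sftp:backup@offsite.example:/backups/nas
export RESTIC_PASSWORD_FILE=/root/.restic-pass

restic snapshots                              # can we even list the backups?
restic check                                  # verify repository integrity
restic restore latest --target /tmp/restore-test \
    --include /srv/data/important-docs        # pull back a known-important subset

# Spot-check that the restored files match the live copies.
diff -r /srv/data/important-docs /tmp/restore-test/srv/data/important-docs
```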

[-] sugar_in_your_tea@sh.itjust.works 3 points 2 months ago* (last edited 2 months ago)

I don't trust automation for restoring from backup, so I keep the restoration process extremely simple:

  1. automate recreating services - have my podman files in a repository
  2. manually download and extract data to a standard location
  3. restart everything and verify that each service works properly

Do that once/year in a VM or something and you should be good. If things are simple enough, it shouldn't take long (well under an hour).

[-] pHr34kY@lemmy.world 15 points 2 months ago* (last edited 2 months ago)

I have a job, and the office is 35km away. I get a locker in my office.

I have two backup drives, and every month or so, I will rotate them by taking one into the office and bringing the other home. I do this immediately after running a backup.

The drives are LUKS-encrypted Btrfs. Btrfs allows snapshots and compression, and LUKS lets me securely password-protect the drive. My backup job is just a Btrfs snapshot followed by an rsync command.
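Roughly, that workflow looks like the following; the device, mapper name and mount points are placeholders, and /srv/data is assumed to be a Btrfs subvolume.

```bash
set -euo pipefail
cryptsetup open /dev/sdX1 backupdrive          # prompts for the LUKS passphrase
mount -o compress=zstd /dev/mapper/backupdrive /mnt/backup

# Read-only snapshot of the live data, named by date.
btrfs subvolume snapshot -r /srv/data "/srv/data/.snapshots/$(date +%F)"

# Mirror the snapshot onto the rotation drive.
rsync -aHAX --delete "/srv/data/.snapshots/$(date +%F)/" /mnt/backup/data/

umount /mnt/backup
cryptsetup close backupdrive
```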

I don't trust cloud backups. There was an incident at work where Google Cloud accidentally deleted an entire company's account just as I was about to start a project there.

[-] traches@sh.itjust.works 15 points 2 months ago* (last edited 2 months ago)

NAS at the parents’ house. Restic nightly job, with some plumbing scripts to automate it sensibly.

[-] merthyr1831@lemmy.ml 9 points 2 months ago

Rsync to a Hetzner storage box. I don't do ALL my data, just the Nextcloud data. The rest is... Linux ISOs... so I can redownload at my convenience.

[-] sxan@midwest.social 8 points 2 months ago

I used to say restic and b2; lately, the b2 part has become more iffy, because of scuttlebutt, but for now it's still my offsite and will remain so until and unless the situation resolves unfavorably.

Restic is the core. It supports multiple cloud providers, making configuration and use trivial. It encrypts before sending, so the destination never has access to unencrypted blobs. It does incremental backups, and supports FUSE vfs mounting of backups, making accessing historical versions of individual files extremely easy. It's OSS, and a single binary executable; IMHO it's at the top of its class, commercial or OSS.
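For a feel of how little that takes, a sketch using the B2 backend as an example; the bucket name, paths and credentials below are made up, and the B2 keys come from the environment.

```bash
export B2_ACCOUNT_ID=xxxxxxxx                 # placeholder key ID
export B2_ACCOUNT_KEY=yyyyyyyy                # placeholder application key
export RESTIC_REPOSITORY=b2:my-backup-bucket:nas
export RESTIC_PASSWORD_FILE=/root/.restic-pass

restic init                                   # once; encryption happens client-side
restic backup /srv/data                       # incremental after the first run
mkdir -p /mnt/restic
restic mount /mnt/restic &                    # browse snapshots as a FUSE filesystem
ls /mnt/restic/snapshots/latest/srv/data
```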

B2 has been very good to me, and is a clear winner for this use case: writes and storage are pennies a month, and it only gets more expensive if you're doing a lot of reads. The UI is straightforward and easy to use, and the API is good; if it weren't for their recent legal and financial drama, I'd still unreservedly recommend them. As it is, you'll have to evaluate it yourself.

[-] foobaz@lemmy.world 2 points 2 months ago

I'm running the same setup, restic -> B2. Offsite, I have a daily rclone job that pulls (the diffs) from B2. Works perfectly; cost is < 1€ per month.
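Something along these lines, presumably; "b2" here stands for a configured rclone remote and the paths are placeholders.

```bash
# Daily pull on the offsite machine, e.g. from cron:
# 30 4 * * * rclone sync b2:my-backup-bucket /srv/offsite-mirror --fast-list
rclone sync b2:my-backup-bucket /srv/offsite-mirror --fast-list \
    --log-file /var/log/rclone-pull.log
```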

[-] drkt@scribe.disroot.org 8 points 2 months ago

I rsync a copy of it to a friend's house every night. It's straightforward, simple and free.
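About as simple as it gets; the host, user, key and paths below are placeholders.

```bash
# Nightly push, e.g. from cron: 0 2 * * * /usr/local/bin/offsite-rsync.sh
rsync -aHAX --delete -e "ssh -i /root/.ssh/backup_ed25519" \
    /srv/data/ backup@friend.example:/srv/backups/mynas/
```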

[-] diegantobass@lemmy.world 5 points 2 months ago

I rsync a copy to mom's

[-] Bassman1805@lemmy.world 8 points 2 months ago

The easiest offsite backup would be any cloud platform. The downside is that you won't own your own data the way you would if you deployed your own system.

Next option is an external SSD that you leave at your work desk and take home once a week or so to update.

The most robust solution would be to find a friend or relative willing to let you set up a server in their house. Might need to cover part of their electric bill if your machine is hungry.

[-] Onomatopoeia@lemmy.cafe 7 points 2 months ago* (last edited 2 months ago)

As others have said, use tools like borg and restic.

Shop around for cloud storage with good pricing for your use-case. Many charge for different usage patterns, like restoring data or uploading.

Check out storj.io, I like their pricing: they charge for downloads/restores (IIRC), and I figure that's a cost I can live with if I ever need to restore.

Otherwise I keep 3 local copies of data:

  1. Copy 1 is live, and backed up to storj.io
  2. Copy 2 is mirrored from copy 1 every other week
  3. Copy 3 is mirrored from copy 1 every other week, on the weeks copy 2 isn't

This works for my use case, where I'm concerned about local failures and mistakes (and don't trust my local storage enough to rely on a backup tool), but my data doesn't change a lot in a week. If I were to lose one week of changes, it would be a minor issue. And I'm trusting my cloud backup to be good (I test it quarterly, and do a single-file restore test monthly).

This isn't an ideal (or even recommended) approach; it just works with the storage I currently have and my level of trust in it.

[-] WeirdGoesPro@lemmy.dbzer0.com 7 points 2 months ago

My ratchet way of doing it is Backblaze. There's a Docker container that lets you run the unlimited personal plan on Linux by emulating a Windows environment. They let you set an encryption key so that they can't access your data.

I’m sure there are a lot more professional and secure ways to do it, but my way is cheap, easy, and works.

[-] hendrik@palaver.p3x.de 6 points 2 months ago* (last edited 2 months ago)

Next to paying for cloud storage, I know people who store an external HDD at their parents' place or with friends. I don't do the whole backup thing for all the recorded TV shows and ripped Blu-rays... if my house burns down, they're gone. But that keeps the amount of data a bit more manageable, and I can replace those.

I currently don't have a good strategy. My data is somewhat scattered between my laptop, the NAS, an external HDD which is in a different room but not offsite, and one cheap virtual server I pay for; critical things like the password manager are synced to the phone as well. The main thing I'm worried about is one of the mobile devices getting stolen, so I focus on having those backed up to the NAS or synced to Nextcloud. But I should work on a solid strategy in case something happens to the NAS.

I don't think the software is the big issue. We've got several good backup tools that can do incremental or full backups, schedules, encryption and whatever else someone might need from backups.

[-] tburkhol@lemmy.world 5 points 2 months ago

It really depends on what your data is and how hard it would be to recreate. I keep a spare HD in a $40/year bank box & rotate it every 3 months. Most of the content is media - pictures, movies, music. Financial records would be annoying to recreate, but if there's a big enough disaster to force me to go to the off-site backups, I think that'll be the least of my troubles. Some data logging has a replica database on a VPS.

My upload speed is terrible, so I don't want to put a media library in the cloud. If I did any important daily content creation, I'd probably keep that mirrored offsite with rsync, but I feel like the spirit of an offsite backup is offline and asynchronous, so things like ransomware don't destroy your backups, too.

[-] ladfrombrad@lemdro.id 3 points 2 months ago

Yeah, me too; photos and videos I've recorded are the only things I'm bothered about. Backing up all my arrrrr booty offsite is redundant, since I've already shared it to a 2.1 ratio and can hopefully download it again from people with larger storage than my family member has.

It's how I handle backing up those photos/videos though. I bought them a 512GB card and shoved it in a GLi AP they have down there, which I sync my DCIM folder to (the app was removed from the Play Store since it didn't need updating, but Google's stupid policies meant it went RIP...), and I also back that up to the old Synology NAS I handed down to them. I suppose I could use Syncthing, but I like that old app; the adage "if it ain't broke, don't fix it" applies.

Along with them having Tailscale on a Pi 4 (on a UPS; it's their/my backup TVHeadend server) and their little N100 media box, I don't even bother them with my meager photo collection, and it works well.

[-] irmadlad@lemmy.world 5 points 2 months ago

so if any questions here seem dumb

Not dumb. I say the same, but I have a severe inferiority complex and imposter syndrome. Most artists do.

1 local backup, 1 cloud backup, and 1 offsite backup to my tiny house at the lake.

I use Syncthing.

[-] lightnsfw@reddthat.com 4 points 2 months ago

It's not all my data, but I use Backblaze for offsite backup. One of the reasons I can't drop Windows. I don't have anywhere I travel often enough to do a physical drop-off, and when I tried setting up a file server at my parents' place, they would break shit by fucking with their router every time they had an internet outage, or by moving it around (despite repeatedly being told to call me first).

[-] Scrollone@feddit.it 2 points 2 months ago

I can't stand the fact that they don't support Linux

[-] sudneo@lemm.ee 3 points 2 months ago

Object storage is something I prefer over their app anyway. Restic (or rustic) does the backup client-side; B2 or any other storage just holds the data. This way you also have no vendor lock-in.

[-] neidu3@sh.itjust.works 4 points 2 months ago* (last edited 2 months ago)

A huge tape archive in a mountain. It's pretty standard for geophysical data. I have some (encrypted) personal stuff on a few tapes there.

[-] cron@feddit.org 3 points 2 months ago

Rclone to cloud storage (Hetzner in my case). Rclone is easy to configure and offers full encryption, even for the file names.

As the data is only uploaded once, a daily backup uploads only the added or changed files.

Just as a side note: make sure you can retrieve your data even in case your main system fails. Make sure you have all the passwords/crypto keys available.
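For anyone curious what that looks like, a sketch of the rclone side: an SFTP remote for the storage box plus a crypt remote layered on top. The hostname, user and paths are placeholders, in practice you'd let `rclone config` write this interactively, and the password must be the obscured output of `rclone obscure`, not plain text.

```bash
cat >> ~/.config/rclone/rclone.conf <<'EOF'
[hetzner]
type = sftp
host = uXXXXXX.your-storagebox.de
user = uXXXXXX
key_file = /root/.ssh/storagebox_ed25519

[hetzner-crypt]
type = crypt
remote = hetzner:backups
filename_encryption = standard
directory_name_encryption = true
password = OUTPUT_OF_rclone_obscure
EOF

# Contents and file names are encrypted; only changed files get re-uploaded.
rclone sync /srv/data hetzner-crypt:nas --log-file /var/log/rclone-backup.log
```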

[-] bandwidthcrisis@lemmy.world 3 points 2 months ago

I use rsync.net

It's not the lowest price, but I like the flexibility of access.

For instance, I was able to run rclone on their servers to do a direct copy from OneDrive to rsync.net: 400GB without it having to go through my connection.

I can mount backups with sshfs if I want to, including the daily zfs snapshots.

[-] tuhriel@infosec.pub 3 points 2 months ago* (last edited 2 months ago)

I have an RPi 4 with an external HDD at my parents' house. I connect via a WireGuard VPN, mount and decrypt the external HDD, and then trigger a restic backup to a restic REST server running as append-only.

The whole thing is done via a Python script.

I chose the rest-server because it allows "append only", so the data can't easily be deleted from my side of the VPN.
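The commenter scripts this in Python; below is a shell sketch of the same flow. The interface name, addresses, device, key file and the rest-server service name are all assumptions, and the rest-server on the Pi is assumed to run with `--append-only`.

```bash
set -euo pipefail
wg-quick up wg-backup                          # VPN to the Pi

# Unlock and mount the external drive on the Pi, then start rest-server.
ssh pi@10.0.0.2 'sudo cryptsetup open /dev/sda1 backup --key-file /root/.hdd-key \
    && sudo mount /dev/mapper/backup /srv/restic \
    && sudo systemctl start rest-server'

# Back up over the tunnel to the REST repository.
restic -r rest:http://10.0.0.2:8000/nas backup /srv/data

ssh pi@10.0.0.2 'sudo systemctl stop rest-server \
    && sudo umount /srv/restic && sudo cryptsetup close backup'
wg-quick down wg-backup
```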

[-] harsh3466@lemmy.ml 3 points 2 months ago

Right now I sneakernet it: I stash a LUKS-encrypted drive in my locker at work and bring it home once a week or so to update the backup.

At some point I'm going to set up an RPi at a friend's house, but that's down the road a bit.

[-] surewhynotlem@lemmy.world 3 points 2 months ago

IDrive has built-in local encryption you can enable.

[-] d00phy@lemmy.world 3 points 2 months ago

My dad and I each have a Synology NAS. We do a Hyper Backup sync from one to the other: I back up to his and vice versa. I also use Syncthing to back up my Plex media so he can mount it locally on his Plex server.

[-] hperrin@lemmy.ca 3 points 2 months ago* (last edited 2 months ago)

I just rsync it once in a while to a home server running at my dad's house. I want it done manually, in a "pull" direction rather than a "push", in case I ever get hit with ransomware.

[-] glizzyguzzler 2 points 2 months ago* (last edited 2 months ago)

I got my parents to get a NAS box and stuck it in their basement; they need to back up their stuff anyway. I put in two 18TB drives (mirrored Btrfs RAID 1) from Server Part Deals (peeps have said that site has jacked up their prices, so look for alternatives). They only need like 4TB at most. I made a backup Samba share for myself. It's the cheapest Synology box possible, using their software to make a Samba share with a quota.

I then set up a WireGuard connection on an RPi and taped that to the NAS. I WireGuard into their local network with a batch script, mount the Samba share, and then use restic to back up my data. It works great. Restic is encrypted, I don't have to pay for storage monthly, their electricity is cheap af, they have backups, I keep tabs on it: everyone wins.
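Roughly what that amounts to, as a sketch; the share address, credentials file and paths are placeholders, and `wg0` stands for the tunnel into their LAN.

```bash
set -euo pipefail
wg-quick up wg0

# Mount the NAS share over the tunnel, back up into it, prune, tidy up.
mount -t cifs //192.168.1.50/backup /mnt/parents-nas \
    -o credentials=/root/.smb-credentials,uid=0,gid=0

restic -r /mnt/parents-nas/restic-repo backup /srv/data
restic -r /mnt/parents-nas/restic-repo forget --keep-weekly 8 --prune

umount /mnt/parents-nas
wg-quick down wg0
```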

Next step is to go the opposite way for them, but there's no rush on that goal; I don't think their basement would get totaled in a fire, and I don't think their house (other than the basement) would get totaled in a flood.

If you don't have a friend or relative to do a box-at-their-house setup (people might be enticed by reciprocal backups), restic still fits the bill: the destination is encrypted, and it has simple commands to check the data for validity.

Rclone crypt is not good enough. Too many issues (path length limits, the password is "obscured" but otherwise right there, and the file structure is preserved even if the names are encrypted). On a VPS I use rclone as a pass-through for restic to back up a small amount of data to a Google Drive. Works great. Just don't fuck with rclone crypt for major stuff.

Lastly, I do use rclone crypt to upload a copy of the restic binary to the destination: the crypt means the binary can't be fucked with, and having the binary there means it's all you need to recover the data (in addition to the restic password you stored safely!).

[-] dan@upvote.au 2 points 2 months ago* (last edited 2 months ago)

For storing the backups, I use a storage VPS. I got one from HostHatch a few years ago during their Black Friday sale, with 10TB of space for $10/month. Hetzner have good deals on their storage boxes too: they offer 5TB for $13/month if you're in the USA (you need to add VAT if you're in Europe).

A good rule of thumb is to never pay more than $5/TB/month, and during Black Friday it's closer to $2/TB/month. The LowEndTalk forum has the best Black Friday deals.

I use Borgbackup for backups, and Borgmatic to handle scheduling them. Borgbackup is a fantastic piece of software.

Borg has an "append only" mode which lets you configure particular SSH keys so they can only add data to the backup, not delete it. Even if someone or something (ransomware, a malicious user, etc.) gains access to your system and tries to delete the backups, they can't. Essentially, this is protection against ransomware.

This is a very common issue with other backup solutions - the client has full access to the backup, so malware on the client system could potentially delete all the backups.
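On the backup host, that boils down to pinning the client's SSH key to `borg serve` in append-only mode; the user, repository path and key below are placeholders.

```bash
# The client holding this key can push new archives but never delete or
# modify existing ones, and is confined to this one repository path.
cat >> /home/backup/.ssh/authorized_keys <<'EOF'
command="borg serve --append-only --restrict-to-path /home/backup/repos/nas",restrict ssh-ed25519 AAAA...client-public-key...
EOF
```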

I have two backup copies of most things: one on my home server and one on my storage VPS. If you do keep multiple backups, Borgbackup recommends creating two separate backups rather than making one and rsyncing it to another server.
