submitted 1 year ago* (last edited 1 year ago) by mfat@lemdro.id to c/linux@lemmy.ml

Can you please share your backup strategies for Linux? I'm curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

48 comments
[-] smeg@feddit.uk 2 points 1 year ago

The important stuff is in cloud storage using Cryptomator (I'm hoping rclone will make syncing simple). I should probably set up Timeshift in case things do go wrong.

[-] bonegakrejg@lemmy.ml 2 points 1 year ago

Rclone makes Cryptomator redundant since it has built-in encryption, if you want to keep it simple.
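For anyone curious, a minimal sketch of what that looks like in rclone.conf (remote names, bucket, and paths are examples, and the crypt password must be the obscured form produced by `rclone config`):

```
[b2]
type = b2
account = b2-key-id
key = b2-application-key

[b2-crypt]
type = crypt
remote = b2:my-backup-bucket/encrypted
password = obscured-password-from-rclone-config
```

Syncing then goes through the encrypted remote, e.g. `rclone sync ~/Documents b2-crypt:documents`.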

[-] smeg@feddit.uk 1 points 1 year ago

Ooh, that's interesting to know! Though I do make use of Cryptomator on my phone too; is rclone on Android in a usable state?

[-] bonegakrejg@lemmy.ml 2 points 1 year ago

Yup, either through Termux or Round Sync

[-] smeg@feddit.uk 1 points 1 year ago

Nice, I'll have an investigate, thanks

[-] traches@sh.itjust.works 2 points 1 year ago* (last edited 1 year ago)

Software & services: restic.

Destinations:

  • Local raspberry pi with external hdd, running restic REST server
  • RAID 1 NAS at parents' house, connected via tailscale, also running restic REST

I've been meaning to set up a drive rotation for the local backup so I always have one offline in case of ransomware, but I haven't gotten to it.

Edit: For the backup set I back up pretty much everything. I'm not paying per gig, though.

[-] Minty95@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

Timeshift for the system, which works perfectly: if you screw up the system (a bad update, for instance), just start it and you'll be back up and running in less than ten minutes. Simple cron backups for data, documents, etc., just in case you delete a folder, document, or image. Both of these go to a second internal HD.
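As a rough sketch of that kind of data cron job (paths and schedule here are examples, not the commenter's actual setup):

```
# crontab -e: copy documents to the second internal disk every night at 02:00
0 2 * * * rsync -a /home/user/Documents/ /mnt/second-disk/backup/Documents/
```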

[-] seaQueue@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I leverage btrfs or ZFS snapshots. I take rolling system-level snapshots on a schedule (daily, weekly, monthly, and separately before any package upgrades or installs) and user-data snapshots every couple of hours. Then I use btrbk to sync those snapshots to an external drive at least once a week. When I have all of my networking gear and home services set up, I also sync all of this to storage on my NAS. Any hosts on the network keep rolling snapshots stored on the NAS as well.
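As an illustration only, a btrbk.conf sketch along these lines (pool path, subvolume names, and retention values are assumptions, not the commenter's actual config):

```
# /etc/btrbk/btrbk.conf: snapshot locally, send/receive to an external drive
snapshot_preserve_min   2d
snapshot_preserve       14d 8w 6m
target_preserve_min     no
target_preserve         20d 10w 6m

volume /mnt/btr_pool
  snapshot_dir btrbk_snapshots
  target send-receive /mnt/external/btrbk
  subvolume @
  subvolume @home
```

Running `btrbk run` from a timer or cron job then takes and syncs the snapshots.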

Important data also gets shoveled into a B2 bucket and/or Google Drive if I need to be able to access it from a phone.

I keep snapshots small by splitting data up into well-defined subvolumes; anything that can be reacquired from the cloud (downloads, package caches, Steam libraries, movies, music, etc.) isn't included in the backup strategy. If I download something that's hard to find or important, I move it out of downloads and into a location covered by my backups.

[-] drwho@beehaw.org 2 points 1 year ago

All of my servers make local dumps of their databases and config files to directories owned by unprivileged users. This includes file paths, permissions, and ownerships (so I know how to put them back).

My primary research server at home uses rsync to pull copies of those local backups from my servers.

My primary research server uses Restic to make a daily incremental backup to Backblaze's B2 service.
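A hedged sketch of that pull-then-push flow (hostnames, paths, bucket name, and retention are placeholders):

```sh
# Pull each server's local dumps down to the research server...
rsync -a --delete backupuser@server1.example.com:/var/local-backups/ /srv/pulled-backups/server1/

# ...then push the collected dumps to Backblaze B2 with restic.
export B2_ACCOUNT_ID="b2-key-id"
export B2_ACCOUNT_KEY="b2-application-key"
export RESTIC_PASSWORD_FILE="/root/.restic-pass"
restic -r b2:my-backup-bucket:servers backup /srv/pulled-backups
restic -r b2:my-backup-bucket:servers forget --keep-daily 14 --keep-weekly 8 --prune
```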

[-] nichtburningturtle@feddit.org 2 points 1 year ago

I have my important folders synced to my Nextcloud and create nightly snapshots of that to a different drive using borg.
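For reference, a nightly borg run along these lines might look like the following (repo path, source directory, and retention are examples; the passphrase is assumed to be configured elsewhere):

```sh
# Create a dated archive of the synced folders, then prune old ones
borg create --stats /mnt/backupdisk/borg-repo::nextcloud-{now:%Y-%m-%d} /srv/nextcloud-sync
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/backupdisk/borg-repo
```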

One thing I still need to do is offsite encrypted backups using rsync.

[-] xlash123@sh.itjust.works 2 points 1 year ago

For my home server, I use restic and a cron job to take weekly snapshots of all my services, which then get synced to a Backblaze B2 bucket (at $6/TB/month). It's pretty neat: it only saves the difference between the previous and current snapshots, removes older snapshots, and encrypts everything.
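A minimal sketch of that weekly cron job, assuming a B2 repo and example paths (credentials would come from an environment file or a wrapper script):

```
# /etc/cron.d/restic-weekly: back up and prune every Sunday at 03:00
0 3 * * 0  root  restic -r b2:my-bucket:server backup /srv/services && restic -r b2:my-bucket:server forget --keep-weekly 8 --keep-monthly 6 --prune
```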

[-] MonkderVierte@lemmy.ml 2 points 1 year ago* (last edited 1 year ago)

Constant work in progress.

[-] shadowtofu@discuss.tchncs.de 2 points 1 year ago

I use Syncthing to sync almost everything across my computer, laptop (occasional usage), server (RAID 1), old laptop (powered up once every month or so), and a few other devices (that only get a small subset of my data, though). On the computer, laptop, and server, I have btrfs snapshots (snapper). Overall, this works very well; I always have 4+ copies of my data in 2+ geographic locations.
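For context, the snapper side of a setup like this is mostly a per-filesystem config; a sketch with assumed retention limits:

```
# Created with: snapper -c home create-config /home
# Excerpt from /etc/snapper/configs/home (limits are examples, not the commenter's values)
TIMELINE_CREATE="yes"
TIMELINE_LIMIT_HOURLY="12"
TIMELINE_LIMIT_DAILY="7"
TIMELINE_LIMIT_WEEKLY="4"
TIMELINE_LIMIT_MONTHLY="3"
```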

[-] savvywolf@pawb.social 2 points 1 year ago

Firstly, for my dotfiles, I use home-manager. I keep the config on my git server and in theory I can pull it down and set up a system the way I like it.

In terms of backups, I use Pika to back up my home directory to my hard disk every day, so I can, in theory, pull back files I delete.

I also push a core selection of my files to my server using Pika, just in case my house burns down. Likewise, I pull backups from my server to my desktop (again with Pika) in case Linode starts messing me about.

I also have a 2 TiB SSD I keep in a strongbox, plus some cloud storage, which I push bigger things to sporadically.

I also take occasional data exports from online services I use. Because hey, Google or Discord can ban you at any time for no reason. :P

[-] capital@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

restic -> Wasabi, automated with a shell script and cron. It uses an include list to tell restic which paths to back up.

The script has Pushover credentials to send me backup alerts. It parses the restic log to tell me how much was backed up, what was removed, whether the backup succeeded or failed, and the current repo size.
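A rough sketch of such a script, assuming a Wasabi S3 repo, an include list, and Pushover credentials (all names and paths here are placeholders, not the commenter's actual values):

```sh
#!/bin/sh
export AWS_ACCESS_KEY_ID="wasabi-access-key"
export AWS_SECRET_ACCESS_KEY="wasabi-secret-key"
export RESTIC_REPOSITORY="s3:https://s3.wasabisys.com/my-backup-bucket"
export RESTIC_PASSWORD_FILE="/root/.restic-pass"

# Back up only the paths listed in the include file and capture the log.
LOG=$(restic backup --files-from /etc/restic/include.txt 2>&1)
STATUS=$?

# Send a short summary to Pushover.
curl -s \
  --form-string "token=PUSHOVER_APP_TOKEN" \
  --form-string "user=PUSHOVER_USER_KEY" \
  --form-string "message=restic backup exit=$STATUS: $(echo "$LOG" | tail -n 3)" \
  https://api.pushover.net/1/messages.json
```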

To be added: a periodic restore of a random file, whose hash will be compared against the current version (it will run right after the backup, so the file is unlikely to have changed in my workload). The restored copy will then be deleted and an alert sent letting me know how the restore test went.

[-] vortexal@lemmy.ml 2 points 1 year ago

The only thing I use as a backup is a live CD image written to a USB thumb drive.

I used to use Timeshift but the one time I needed it, it didn't work for some reason. It also had a problem of making my PC temporarily unusable while it was making a backup, so I didn't enable it when I had to reinstall Linux Mint.

[-] krakenfury@lemmy.sdf.org 2 points 1 year ago

I sync important files to S3 from a folder with the AWS CLI. Dotfiles and projects are in private git repos. That's it.
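For illustration, that sync is a one-liner (bucket name and folder are examples; credentials are assumed to be set up via `aws configure`):

```sh
aws s3 sync ~/important s3://my-backup-bucket/important
```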

If I maintained a server, I would do something more sophisticated, but installation is so dead simple these days that I could get a daily driver in working order very quickly.

[-] potentiallynotfelix@lemmy.fish 2 points 1 year ago

If I feel like it, I might use dd to clone my drive and put it on a hard drive. Usually I don't back up, though.
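A sketch of that kind of whole-drive clone (device and output paths are examples; double-check if= and of= before running anything like this):

```sh
sudo dd if=/dev/sda of=/mnt/external/drive-clone.img bs=4M status=progress conv=fsync
```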

[-] gerdesj@lemmy.ml 2 points 1 year ago

You have loads of options but you need to also start from ... "what if". Work out how important your data really is. Take another look and ask the kids and others if they give a toss. You might find that no one cares about your photo collection in which case if your phone dies ... who cares? If you do care then sync them to a PC or laptop.

Perhaps take a look at this: https://www.veeam.com/products/free/linux.html. It's free for a few systems.

[-] Veraxis@lemmy.world 1 points 1 year ago

For system files/configuration on my machines, Timeshift set to run once a week.

For family photos and shared files, I built a pair of SFTP servers made from old HP thin-client PCs at two different geographic locations, which automatically sync to each other once a day via a cron job using vsftpd and lftp. Each one has both an NVMe and a SATA SSD, which run in a software RAID 1 configuration.
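A hedged sketch of the daily mirror job on one of those servers (host, user, and paths are examples; authentication is assumed to be handled by SSH keys):

```
# crontab entry: push newer files to the peer server every night at 03:15
15 3 * * * lftp -e "mirror -R --only-newer /srv/photos /srv/photos; quit" sftp://syncuser@peer.example.com
```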

For any other files, a second local server also using vsftpd and two SSDs in USB enclosures. I manually back them up using rsync on an irregular basis.

Timeshift for configs to a locally attached drive. Home partition to the cloud with rsync.

[-] spacemanspiffy@lemmy.world 1 points 1 year ago

Dotfiles are handled by GNU Stow and git. I have this on all my devices.
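For illustration, the Stow workflow from a dotfiles checkout looks roughly like this (package names are examples):

```sh
cd ~/dotfiles
stow -t ~ zsh tmux nvim   # symlink each package's files into $HOME
```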

Projects live in git.

Media is periodically rsynced from my server to an external drive.

Been meaning to put all my docker-composes into git as well...

I don't back up too much else.

[-] clif@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Internal RAID 1 as the first line of defense. Rsync to external drives, at least one of which is always offsite, as the second. Rclone to cloud storage for my most important data as the third.

Backups 2 and 3 are manual but I have reminders set and do it about once a month. I don't accrue much new data that I can't easily replace so that's fine for me.

[-] fmstrat@lemmy.nowsci.com 1 points 1 year ago

All important files go in /data.

/data is ZFS, snapped and sent to NAS regularly
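A minimal sketch of that snap-and-send step (pool, dataset, snapshot names, and NAS hostname are placeholders):

```sh
zfs snapshot tank/data@2024-10-15
zfs send -i tank/data@2024-10-08 tank/data@2024-10-15 | ssh nas zfs receive backup/data
```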

Every time I change a setting, it gets added to a dconf script. Every time I install software, I write a script.

Dotfiles git repo for home directory.

With that, I can spin up a fresh machine in minutes with scripts.

[-] Peasley@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

I built a backup server out of my old desktop, running Ubuntu and ZFS

I have a dataset for each of my computers, and I back them up to the corresponding datasets in the ZFS pool on the server semi-regularly. The ZFS pool has enough disks for some redundancy, so I can handle occasional drive failures. My other computers run arbitrary filesystems (ext4, btrfs, rarely NTFS).

The only problem with my current setup is that if there is file degradation on my workstation that I don't notice, it might get backed up to the server by mistake; then a degraded file might overwrite a non-degraded backup. To avoid this, I generally don't overwrite files when I back up. Since 90% of my data is pictures, it's not a big deal, because they don't change.
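Whatever tool does the copying, that "don't overwrite existing backups" rule can be expressed with something like rsync's --ignore-existing flag (host, pool, and paths below are examples, not the commenter's actual layout):

```sh
rsync -a --ignore-existing ~/Pictures/ backupserver:/tank/workstation1/Pictures/
```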

Someday I'd like to set up Proxmox and virtualize everything, and I'd also like to set up something offsite I could zfs send to as a second backup.

this post was submitted on 15 Oct 2024
173 points (100.0% liked)
