submitted 1 year ago by mo_ztt@lemmy.world to c/opensource@lemmy.ml
[-] foonex@feddit.de 2 points 1 year ago* (last edited 1 year ago)

tl;dr: Duplicity does full or incremental backups; BorgBackup only does full backups, but with deduplication.

After the first backup with Duplicity, you can choose to do an incremental backup, which only stores the data that has changed since the last backup. This saves time and disk space, but you still have to do slow full backups regularly. See question 3 of the Duplicity FAQ.
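As a rough illustration of that schedule, here is a minimal Python sketch that drives the duplicity CLI via subprocess: a slow full backup once a month, fast incrementals the rest of the time. The source directory and target URL are made up for the example; only the `duplicity full` and `duplicity incremental` actions themselves are real.

```python
# Minimal sketch of a Duplicity schedule: occasional slow full backups,
# frequent fast incrementals in between. SOURCE and TARGET are hypothetical.
import subprocess
from datetime import date

SOURCE = "/home/me/documents"            # hypothetical source directory
TARGET = "file:///mnt/backup/documents"  # hypothetical target URL

def backup(day: date) -> None:
    # Full backup on the first of the month, incremental otherwise.
    action = "full" if day.day == 1 else "incremental"
    subprocess.run(["duplicity", action, SOURCE, TARGET], check=True)

if __name__ == "__main__":
    backup(date.today())
```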

BorgBackup always does a full backup, but it divides all data into chunks. It then hashes those chunks and stores them in a content-addressed storage layer, so it basically works like Git under the hood (plus encryption). If a chunk doesn't change between backups, it's already there and doesn't have to be stored again. A backup is always a full index of the data.
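To make the idea concrete, here is a toy Python sketch of content-addressed chunk storage. It is only an illustration of the principle, not Borg's actual implementation: Borg uses content-defined, variable-size chunks plus encryption and compression, while this sketch uses fixed-size chunks and an in-memory dict as the "repository".

```python
# Toy content-addressed chunk store: the deduplication idea behind BorgBackup.
# Fixed 4 MiB chunks and an in-memory dict are simplifications for the example.
import hashlib
from pathlib import Path

CHUNK_SIZE = 4 * 1024 * 1024   # arbitrary fixed chunk size for this sketch
store: dict[str, bytes] = {}   # chunk hash -> chunk data (the "repository")

def backup_file(path: Path) -> list[str]:
    """Split a file into chunks, store each chunk under its hash,
    and return the list of hashes (the file's entry in the backup index)."""
    hashes = []
    with path.open("rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            # A chunk that is already in the store costs nothing extra.
            store.setdefault(digest, chunk)
            hashes.append(digest)
    return hashes

def restore_file(hashes: list[str], target: Path) -> None:
    """Reassemble a file from its chunk hashes: every backup is a full index."""
    with target.open("wb") as f:
        for digest in hashes:
            f.write(store[digest])
```

Backing up the same unchanged file twice stores each chunk only once, yet both backups keep a complete list of hashes, so each one can be restored on its own.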

With today's fast processors and hashing algorithms, a backup with Borg should be just as fast as an incremental backup with Duplicity. If you ask me, deduplicated backups are just plain superior.

Another tool that works like BorgBackup is Restic, which I prefer. Both are good choices that I would trust with my data.
