Deduplication tool (lemmy.world)
submitted 4 months ago* (last edited 4 months ago) by Agility0971@lemmy.world to c/linux@lemmy.ml

I'm in the process of setting up a proper backup solution. However, over the years I've copy-pasted my home directory from different systems as a quick and dirty solution. Now I have to pay my technical debt and remove the duplicates. I'm looking for a deduplication tool that will:

  • accept a destination directory
  • delete source locations after the operation
  • if two files' content is the same, delete the redundant copy
  • if files' content differs, move the file and rename it to avoid a name collision

I tried doing it in Nautilus, but it only looks at file names, not file content. E.g. if two photos have the same content but different names, it will still create a redundant copy.
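The wishlist above can be sketched as a short script: hash each file's content, delete a source file whose hash already exists in the destination, and otherwise move it, renaming on a name collision. This is a minimal, hypothetical sketch (the function name `dedup_into` and the `_dup` renaming scheme are my own), not a hardened tool — no error handling, and symlinks and permissions are ignored.

```python
import hashlib
import shutil
from pathlib import Path

def file_hash(path: Path) -> str:
    """Hash a file's content (not its name) in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def dedup_into(dest: Path, sources: list[Path]) -> None:
    # Index the destination by content hash.
    seen = {file_hash(p): p for p in dest.rglob("*") if p.is_file()}
    for src in sources:
        for p in sorted(x for x in src.rglob("*") if x.is_file()):
            digest = file_hash(p)
            if digest in seen:
                p.unlink()  # identical content already exists in dest
            else:
                target = dest / p.name
                while target.exists():  # rename to avoid name collisions
                    target = target.with_name(target.stem + "_dup" + target.suffix)
                shutil.move(str(p), target)
                seen[digest] = target
```

In practice an existing tool like fdupes or rmlint covers the "delete identical copies" part; the sketch mainly shows why content hashing, not file names, is the right comparison key.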

Edit: Some comments suggested using duperemove, which relies on btrfs's deduplication feature. It replaces duplicate file content with references to the same on-disk location. This is not what I intend; I intend to remove the redundant files completely.

Edit 2: Another quite cool solution is to use hardlinks. It replaces all occurrences of the same data with a hardlink. Then the redundant directories can be traversed and whatever is a link can be deleted; the remaining files will be unique. I'm not going for this myself as I don't trust myself to write a bug-free implementation.
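The hardlink-then-prune idea above could look something like this, assuming a tool such as hardlink(1) has already linked duplicates across both trees (`remove_hardlinked` is a hypothetical name). It also shows the kind of edge case the poster is wary of: if two copies of a file exist only inside the redundant tree, this deletes both of them.

```python
import os
from pathlib import Path

def remove_hardlinked(redundant_dir: Path) -> None:
    """After duplicates have been replaced with hard links, a regular
    file here with a link count > 1 also exists under another path, so
    deleting this copy loses no data -- unless the only other link is
    also inside redundant_dir, which this naive sketch does not check."""
    for p in list(redundant_dir.rglob("*")):
        if p.is_file() and not p.is_symlink() and os.stat(p).st_nlink > 1:
            p.unlink()
```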

[-] Kualk@lemm.ee 3 points 4 months ago

hardlink

One of the most underrated tools, and it's frequently already installed on your system. It recognizes btrfs. Be aware that there are multiple versions of it in the wild.

It can run unattended.

https://www.man7.org/linux/man-pages/man1/hardlink.1.html

[-] Tramort@programming.dev 1 points 4 months ago

Is hardlink the same as ln without the -s switch?

I tried reading the page but it's not clear

[-] deadbeef79000@lemmy.nz 3 points 4 months ago* (last edited 4 months ago)

ln creates a hard link, ln -s creates a symlink.

So, yes, the hardlink tool effectively replaces a file's duplicates with hard links automatically, as if you'd used ln manually.
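The difference is easy to demonstrate. A small sketch (the file names are made up) showing that a hard link shares the original's inode and survives deletion of the original, while a symlink is just a stored path that dangles:

```python
import os
import tempfile
from pathlib import Path

d = Path(tempfile.mkdtemp())
orig = d / "orig.txt"
orig.write_text("data")

os.link(orig, d / "hard.txt")     # like `ln`: same inode, same data blocks
os.symlink(orig, d / "soft.txt")  # like `ln -s`: a file containing a path

# The hard link is the same underlying file as the original.
assert os.stat(d / "hard.txt").st_ino == os.stat(orig).st_ino
assert (d / "soft.txt").is_symlink()

# Deleting the original breaks the symlink but not the hard link.
orig.unlink()
assert (d / "hard.txt").read_text() == "data"
assert not (d / "soft.txt").exists()  # dangling symlink
```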

[-] Tramort@programming.dev 2 points 4 months ago

Ahh! Cool! Thanks for the explanation.

[-] Agility0971@lemmy.world 1 points 4 months ago

This will indeed save space, but I don't want links either. I want unique files.

this post was submitted on 23 Jun 2024
82 points (100.0% liked)
