
fdupes's People

Contributors

adrianlopezroche, chenyueg, coreyhuinker, davidfetter, doughdemon, falkartis, glensc, ivan-dives, jshcmpbll, kucharskim, maxyz, mvyskocil, pohsun-su, sandrotosi, stefanbruens, thiagostahlschmidt, tomhoover, valentijnscholten


fdupes's Issues

Merge n groups of hardlinked files to a single hardlinked file

From @sandrotosi on December 20, 2015 14:5

From [email protected] on October 23, 2011 21:55:19

What steps will reproduce the problem?
1. fdupes -H -L

What is the expected output? What do you see instead?
fdupes does what I asked.

Instead:
fdupes: options --linkhard and --hardlinks are not compatible

What version of the product are you using? On what operating system?
fdupes 1.50-PR2 on Debian Squeeze amd64

Please provide any additional information below.
The only way to achieve the desired result is to run fdupes --linkhard multiple times (until it stops merging hardlinked groups of files).

This is impractical for large directories.

For example: if I run
mkdir -p test/a test/b
echo "blah" >test/a/a
echo "blah" >test/b/a
ln test/a/a test/a/b
ln test/b/a test/b/b

Now a/a and a/b share one inode, and b/a and b/b share another.

fdupes -r -L test

Now a/a, a/b and b/b share one inode, but b/a is all by itself.

fdupes -r -L test

Now they all share one inode, the desired result.
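Until a single pass can merge across hardlink groups, the repeated invocation can be automated. A minimal sketch (my own workaround, not part of fdupes; assumes GNU `stat` and that `fdupes` is on the PATH): rerun the link pass until the number of distinct inodes in the tree stops shrinking.

```shell
# Count distinct inodes among regular files under $1 (GNU stat assumed).
count_inodes() {
    find "$1" -type f -exec stat -c %i {} + | sort -u | wc -l
}

# Rerun `fdupes -r -L` until no further hardlink groups get merged,
# i.e. until the inode count stops changing between passes.
merge_until_stable() {
    dir=$1
    before=$(count_inodes "$dir")
    while :; do
        fdupes -r -L "$dir" > /dev/null
        after=$(count_inodes "$dir")
        [ "$after" -eq "$before" ] && break
        before=$after
    done
}
```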

Original issue: http://code.google.com/p/fdupes/issues/detail?id=22

Copied from original issue: sandrotosi/fdupes-issues#8

keep the newest of duplicates

Hi,
I'm wondering if it would be possible to add an option that deletes the older duplicates and keeps the newest copy, i.e. search and delete by timestamp?
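Nothing built in does this today. As a stopgap, a small shell helper (a sketch, not fdupes functionality) can keep only the most recently modified file from a group:

```shell
# Given one duplicate group as arguments, delete every file except the
# one with the newest modification time. Sketch only: assumes filenames
# without embedded newlines.
keep_newest() {
    newest=$(ls -t -- "$@" | head -n 1)
    for f in "$@"; do
        [ "$f" = "$newest" ] || rm -- "$f"
    done
}
```

The blank-line-separated groups printed by `fdupes -rq` would still need to be split before being fed to such a helper.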

Questions about contributing

I am currently working on some changes to fdupes to make it work a bit better with the btrfs de-duplication process duperemove.

I have added some new arguments to limit the files considered based on file size, and also to skip the final byte-by-byte verification (btrfs will do this anyway as part of its de-duplication phase).

My question is whether there are any specific guidelines I should follow to help get my pull requests accepted, or things I should avoid.

I also made some minor tweaks to speed things up slightly and to make better use of the getfilestats function.

Feature request: option to have symlinks instead of hardlinks (--linkhard vs. "--linksyn")

From @sandrotosi on December 20, 2015 14:5

From sandro.tosi on January 19, 2014 23:07:22

Hello,
this is Debian bug report http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=658339:

It would be nice if fdupes could not only hardlink dupes, but also had an option to do a similar thing with symbolic links.


Original issue: http://code.google.com/p/fdupes/issues/detail?id=31

Copied from original issue: sandrotosi/fdupes-issues#10

Feature request: ignore files copied with --reflink=auto on BTRFS

Files copied with --reflink=auto on BTRFS are already sharing the same data.
However, fdupes identifies them as duplicates (and in a technical sense, they are indeed duplicates).
It would be nice if we could have an option to ignore such files.

Thanks for this useful software.

Write progress messages to the tty

From @sandrotosi on December 20, 2015 14:5

From matrixhasu on August 04, 2011 23:45:09

Hello,
I'm forwarding the debian bug #625527, http://bugs.debian.org/625527 :

The progress messages ("Building file list" + twirling baton) are
written to standard error. Thus, when logging the output of
fdupes (fdupes >fdupes.log 2>&1), you either run with -q and
have no progress reports or get progress noise in the log.

Progress messages should be written to the controlling terminal
(/dev/tty).
<<<

Regards,
Sandro
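In the meantime, the noise can be kept out of the log simply by not redirecting stderr into it (or by pointing stderr at /dev/tty when a terminal is attached). A generic sketch, with `"$@"` standing in for the fdupes invocation:

```shell
# Log only stdout; stderr (the progress baton) keeps going wherever it
# already points - the terminal, in an interactive session. With a
# controlling terminal you could instead force it there via 2>/dev/tty.
log_stdout_only() {
    log=$1; shift
    "$@" > "$log"
}

# e.g.: log_stdout_only fdupes.log fdupes -r somedir
```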

Original issue: http://code.google.com/p/fdupes/issues/detail?id=19

Copied from original issue: sandrotosi/fdupes-issues#7

fdupes: reduce typing for preserve all during interactive

From @sandrotosi on December 20, 2015 14:4

From matrixhasu on October 08, 2009 21:57:09

Debian bug #383965 - http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=383965

I would like to have an option for the interactive mode to be able to just preserve all files by pressing enter, instead of typing "all" and then pressing enter. Three extra characters might not seem like much, but it gets tiring when you have lots of files to review. For now, I'll use the mouse to paste "all\n".

[1] ./home/pabs/projects/chm/samples-itolitls/-help2/TestCol.HxF
[2] ./home/pabs/patches/the_clit/clit12/z/TestCol.HxF

Set 76 of 1784, preserve files [1 - 2, all](220 bytes each):
---> Should skip to the next set after this
Set 76 of 1784, preserve files [1 - 2, all](220 bytes each):
Set 76 of 1784, preserve files [1 - 2, all](220 bytes each): all

Version: 1.40

Original issue: http://code.google.com/p/fdupes/issues/detail?id=2

Copied from original issue: sandrotosi/fdupes-issues#2

fdupes: option to replace duplicates with hard links

From @sandrotosi on December 20, 2015 14:4

From matrixhasu on October 08, 2009 22:13:05

Debian bug #284274 - http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=284274

From: Rupert Levene [email protected]

It would be nice to have the option of telling fdupes to replace
duplicate files with hard links. This would be a more symmetric
behaviour than using symlinks.


From: Javier Fernández-Sanguino Peña [email protected]

Attached is a patch to the program sources (through the use of a dpatch patch in the Debian package) that adds a new -L / --linkhard option to fdupes. This option will replace all duplicate files with hardlinks, which is useful to reduce space.

It has been tested only slightly, but the code looks (to me) about right.

Attachment: 284274_fdupes_hardlink_repace.diff

Original issue: http://code.google.com/p/fdupes/issues/detail?id=8

Copied from original issue: sandrotosi/fdupes-issues#5

fdupes: option to sort by size

From @sandrotosi on December 20, 2015 14:4

From matrixhasu on October 08, 2009 21:58:52

Debian bug #383962 - http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=383962

I would like an option to sort the list of duplicates by file size (both ascending and descending). This would be especially useful for the interactive mode, but it might also be useful for the listing mode.

Original issue: http://code.google.com/p/fdupes/issues/detail?id=3

Copied from original issue: sandrotosi/fdupes-issues#3

sha256 instead of md5

Hello and thanks for fdupes,

I am concerned that MD5 is weak against collision attacks (one could forge a file whose MD5 collides with another's in order to force the deletion of something from an archive). For this reason I'm interested in enhancing fdupes with a SHA256 detection mechanism in place of MD5, provided either by a standalone implementation or by the OpenSSL shared library.

Do you have any intention to merge such a pull request?

Feature request: Keep the shortest filename

Many times, duplicates come up as

file.jpeg
file1.jpeg
file2.jpeg

or

file.jpeg
file (1).jpeg
file (2).jpeg
file (3).jpeg

Once these patterns are recognized, it's easy to see that the longer filenames are the unnecessary ones.

Solaris 10 and 11 Support for fdupes 1.6.1 - patch for md5/md5.c w.r.t. lack of endian.h

Greetings,

Attached is a simple patch that allows Solaris 10 and Solaris 11 systems to compile fdupes, by patching the file md5/md5.c to use a Solaris header to determine the byte order of the platform. I don't currently have a SPARC system (which is big endian), but the patch is quite conservative in its tests, so I hope it will work on Solaris/SPARC systems, and also OpenSolaris / illumos / ...

The patch was tested on 32-bit and 64-bit X86 systems using locally-built GCC 4.8.5 / 4.9.3 for Sun Solaris 10 Update 8, Oracle Solaris 10 Update 11, Oracle Solaris 11.2, by running "./fdupes -r .", in the build directory.

Additionally, I saw that the CHANGES file mentions the removal of support for an external md5sum program, but several references to that program still exist in the manual page; no patch supplied for that.

Regards,
Peter Bray (Sydney, Australia)

fdupes-1.6.1-sun-endian-support.patch.txt

create an option to skip files with more than X links

I had to use this great tool to deal with the consequences of some poor design decisions from the past, which resulted in many duplicate files being stored; so many that for some of them the number of duplicates was on the order of hundreds of thousands.

I had to write a script to "compact" this tree and replace each group of identical files with hardlinks. I wanted to make this fully automatic so it can be reused until the root cause of the mess is dealt with.

It would be nice if I could tell the tool to ignore any file whose inode already has more than one hardlink (in my case those are already "processed"). I guess it could be a generic option to ignore inodes with more than a specified number of hard links.
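Until such an option exists, find(1) can at least identify the files that are not yet part of a hardlink group. A sketch of the filter only (fdupes itself takes directories, not file lists, so this is illustrative, not a drop-in pipeline):

```shell
# List regular files under $1 whose inode has exactly one link,
# i.e. files not yet replaced by hardlinks. For "more than X links"
# find also accepts -links +X, and -links -X for "fewer than X".
list_single_link_files() {
    find "$1" -type f -links 1
}
```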

To-do items from 1.50-PR2

Old to-do list from fdupes 1.50-PR2. I'm removing the TODO file from the master branch, so I've copied the contents here. A few of these have already been implemented, others not. This list is only for reference and is not meant to preclude or discourage adding any of these issues to the issue tracker.

  • A bug with -S shows wrong results.

  • A bug causes the following behavior:

    $ fdupes --symlinks testdir
    testdir/with spaces b
    testdir/with spaces a

    testdir/zero_b
    testdir/zero_a

    testdir/symlink_two
    testdir/twice_one

    $ cp testdir/two testdir/two_again
    $ fdupes --symlinks testdir
    testdir/two_again
    testdir/two
    testdir/twice_one
    testdir/symlink_two

    testdir/with spaces b
    testdir/with spaces a

    testdir/zero_b
    testdir/zero_a

    ** This is not the desired behavior. Likewise:

    $ fdupes testdir
    testdir/with spaces b
    testdir/with spaces a

    testdir/zero_b
    testdir/zero_a

    testdir/twice_one
    testdir/two

    $ fdupes --symlinks testdir
    testdir/with spaces b
    testdir/with spaces a

    testdir/zero_b
    testdir/zero_a

    testdir/symlink_two
    testdir/twice_one

  • Don't assume that stat always works.

  • Add partial checksumming where instead of MD5ing whole
    files we MD5 and compare every so many bytes, caching
    these partial results for subsequent comparisons.

  • Option -R should not have to be separated from the rest,
    such that "fdupes -dR testdir", "fdupes -d -R testdir",
    "fdupes -Rd testdir", etc., all yield the same results.

  • Add option to highlight or identify symlinked files (suggest
    using --classify to identify symlinks with @ suffix... when
    specified, files containing @ are listed using @).

  • Consider autodeletion option without user intervention.

  • Consider option to match only to files in specific directory.

  • Do a little commenting, to avoid rolling eyes and/or snickering.

  • Fix problem where MD5 collisions will result in one of the
    files not being registered (causing it to be ignored).
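The partial-checksumming idea above can be prototyped outside the program: hash only the first few KiB as a cheap pre-filter, and do the full hash or byte comparison only when prefixes collide. A sketch assuming GNU coreutils `md5sum` (my illustration, not the planned implementation):

```shell
# Cheap pre-filter: MD5 of just the first 4096 bytes. Two files can only
# be duplicates if their partial hashes match; a full comparison is
# still required afterwards to confirm.
partial_md5() {
    head -c 4096 -- "$1" | md5sum | cut -d ' ' -f 1
}
```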

Free memory after a duplicate was handled

Apparently, memory for a dupe isn't freed after it has been handled.

This means the system stays under memory/swap pressure even after most of a (very large) set of files has been handled.

Memory should be freed earlier.

fdupes: Want option to lexically sort file names

From @sandrotosi on December 20, 2015 14:4

From matrixhasu on October 08, 2009 22:00:44

Debian bug #131764 - http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=131764

I want to use fdupes to weed out duplicate files (as opposed to different files with duplicate names) saved from pan. pan mangles duplicate names ${first}.${last} into ${first}_(number).${last}. I would surmise that currently (as shown by its behaviour) fdupes builds its file list in raw directory order. As additional options, I would like to be able to sort each duplicate list by lexical order, or by lexical order without the leading directory. If --omitfirst is used, the sorting should happen BEFORE the first file is omitted from each set.

Original issue: http://code.google.com/p/fdupes/issues/detail?id=4

Copied from original issue: sandrotosi/fdupes-issues#4

Feature: size range

Hi,
it would be great if fdupes were able to search for dupes within a size range.

What do you think?

Detect dupes between two trees, not within a single one and file names

Hello,

I guess this falls under feature requests:

It would be very useful in my case to find dupes (and changes) between two file trees, but not dupes within either one of them; i.e., to search /a and /b, but not check for dupes such as /a/1.txt and /a/1.txt.bak.

It would also be great if such a feature could apply a "same name" constraint when creating hardlinks, so that the comparison only occurs when the two files have the same basename.

Feature: Exclude Folder From Search

A feature that would be useful to add would be the option to ignore/exclude a folder.

For example, I want to find dupes in my home folder but don't want it to delve into, say, ~/.wine or ~/.config.

A possible solution could be something like: fdupes -R --exclude={.wine,.config} ~/ or for simplicity:
fdupes -R --exclude=".wine" ~/
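fdupes has no exclude option today; one stopgap is to post-filter its blank-line-separated list output, dropping excluded paths and then any group left with fewer than two members. A sketch (awk in paragraph mode; my workaround, not a proposed interface):

```shell
# Read fdupes list output on stdin; drop lines whose path starts with
# the prefix given in $1, then drop groups reduced to a single file
# (one remaining file is no longer a duplicate group).
filter_exclude() {
    awk -v ex="$1" '
        BEGIN { RS = ""; FS = "\n" }
        {
            out = ""; n = 0
            for (i = 1; i <= NF; i++)
                if (index($i, ex) != 1) { out = out $i "\n"; n++ }
            if (n >= 2) print out
        }'
}
```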

Combine redundant code bits that only call stat()

Several functions call stat() and only return a single value from the struct stat returned:

filesize()
getdevice()
getinode()
getmtime()
getctime()

There is also a stat() call at line 327 and a function getfilestats() which calls some of the stat() functions mentioned.

The overhead from redundant function calls and system stat() calls is heavy: for the 19 files and dirs in testdir, running fdupes -nrq testdir/ results in a total of 163 redundant stat() calls according to an strace log. On a different file tree with 1056 files and dirs, the excess stat() count shoots up to 38559.

I propose combining all of these functions so that each file is stat()ed only one time, with the relevant struct stat items stored all at once.

tar-style exclusion lists.

Largely supersedes #8.
It would save a lot of hassle to be able to use --exclude-from=… switches.
I.e. it doesn't matter to me if it finds dupes in .bak or .err files; they are redundant and cleaned regularly.

google code issues were not migrated to github

Hey! Any reason why the gcode issues were not migrated to GitHub? I have a lot of Debian bugs linked to upstream issues, and I'd like to point them to GitHub now, but I would prefer not to create all the issues by hand (as that would mean losing some of the replies/information).

could you import them please? thanks!

Sometimes fdupes reports files pointing to the same inode as duplicates even when the -H parameter isn't used.

From @sandrotosi on December 20, 2015 14:5

From sandro.tosi on April 27, 2012 18:54:25

Hello,
I'm forwarding the Debian bug as reported at: http://bugs.debian.org/670631 >>>
Try running these commands:

mkdir 1 2 3
echo AAAAAAAAAAAAAAA > 1/A1
echo AAAAAAAAAAAAAAA > 2/A2
ln 1/A1 3/A3
echo BBBBBBBBBBBBBBB > 1/B1
echo BBBBBBBBBBBBBBB > 2/B2
ln 2/B2 3/B3
echo CCCCCCCCCCCCCCC > 1/C1
ln 1/C1 2/C2
echo CCCCCCCCCCCCCCC > 3/C3
ls -li [123]/[ABC][123]
fdupes -r 1 2 3
fdupes -r 1 3 2
fdupes -r 2 1 3
fdupes -r 2 3 1
fdupes -r 3 1 2
fdupes -r 3 2 1

When you run the fdupes commands, each should report two duplicate Ax files, two duplicate Bx files and two duplicate Cx files;
not three files, as the third file is a hard link. But sometimes this isn't the case:

fdupes -r 1 3 2

3/A3
2/A2

1/B1
2/B2

1/C1
3/C3
2/C2

ls -li [123]/[ABC][123]

90180 -rw-r--r-- 2 root root 16 Apr 27 14:53 1/A1
90182 -rw-r--r-- 1 root root 16 Apr 27 14:53 1/B1
90185 -rw-r--r-- 2 root root 16 Apr 27 14:53 1/C1
90181 -rw-r--r-- 1 root root 16 Apr 27 14:53 2/A2
90184 -rw-r--r-- 2 root root 16 Apr 27 14:53 2/B2
90185 -rw-r--r-- 2 root root 16 Apr 27 14:53 2/C2
90180 -rw-r--r-- 2 root root 16 Apr 27 14:53 3/A3
90184 -rw-r--r-- 2 root root 16 Apr 27 14:53 3/B3
90186 -rw-r--r-- 1 root root 16 Apr 27 14:53 3/C3

As seen above, fdupes reports "1/C1" and "2/C2" as duplicates, even though they point to the same inode.
<<<

Regards,
Sandro

Original issue: http://code.google.com/p/fdupes/issues/detail?id=24

Copied from original issue: sandrotosi/fdupes-issues#9

fdupes - very slow on huge files

When searching for duplicates among very large files, the hashing algorithm is too slow.
There should be a command line option to use only file size and date for comparing.

Or at least a faster hashing algorithm?

On my Core i7 with 8 GB RAM it takes several (10 to 50) seconds per file when they are several GB in size.
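A size-only pre-pass is easy to approximate outside fdupes, and shows how much work file size alone can prune. A sketch assuming GNU find's `-printf` (not an fdupes feature):

```shell
# Print "size path" for every file whose size occurs more than once
# under $1 - the only candidates that can possibly be duplicates.
# Everything with a unique size needs no hashing at all.
size_candidates() {
    find "$1" -type f -printf '%s %p\n' | sort -n |
        awk '{ cnt[$1]++; line[NR] = $0; sz[NR] = $1 }
             END { for (i = 1; i <= NR; i++) if (cnt[sz[i]] > 1) print line[i] }'
}
```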

Replace duplicate files with hard links option is missing.

Where did it go? See this post for proof that it existed.

* debian/patches/50_bts284274_hardlinkreplace.dpatch
    - added -L / --linkhard to make fdupes replace files with hardlinks. Also
      update the manual page; thanks to Rupert Levene for the report and to
      Javier Fernández-Sanguino Peña for the patch; Closes: #284274

please respect access permissions for -L

From @sandrotosi on December 20, 2015 14:5

From matrixhasu on August 01, 2011 15:36:46

I'm forwarding the debian bug 635158, http://bugs.debian.org/635158 :

fdupes should respect uid, gid and access permissions before
replacing files by hard links. Sample session:

% umask 0022; echo hello >a; cp -p a b; ln b c; chmod go-r b; ls -li
total 12
27394110 -rw-r--r-- 1 harri harri 6 Jul 23 11:36 a
27394111 -rw------- 2 harri harri 6 Jul 23 11:36 b
27394111 -rw------- 2 harri harri 6 Jul 23 11:36 c
% fdupes -L .
[+] ./c
[h] ./a
[h] ./b

% ls -li
total 12
27394111 -rw------- 3 harri harri 6 Jul 23 11:36 a
27394111 -rw------- 3 harri harri 6 Jul 23 11:36 b
27394111 -rw------- 3 harri harri 6 Jul 23 11:36 c

See how a world readable file became unreadable for anybody
but the owner?

Since "a" has different access permissions it shouldn't
have been replaced by a hard link to b. The same goes for
identical files with different owners. fdupes is losing
too much information here.
<<<
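The check being asked for amounts to comparing mode, owner and group before linking. A sketch of such a guard (GNU `stat` assumed; my illustration, not fdupes code):

```shell
# True only when two files agree on permission bits, owner and group -
# the cases where replacing one with a hardlink loses no metadata.
same_metadata() {
    [ "$(stat -c '%a %u %g' -- "$1")" = "$(stat -c '%a %u %g' -- "$2")" ]
}
```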

Original issue: http://code.google.com/p/fdupes/issues/detail?id=18

Copied from original issue: sandrotosi/fdupes-issues#6

Feature request: Find all copies of a reference file in a directory tree.

It would be nice to be able to do something like:
fdupes --reffile <file> <directory>
And get a list of files in the directory that are duplicates of the specified file (in particular when the file is not itself in that directory).

This would be really useful when dealing with filesystem snapshots and trying to delete a file from all snapshots as well as the main filesystem (to free up space, or to just make sure it's actually gone).
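This can be approximated today with find and cmp: restrict the search to files of the same size, then byte-compare each against the reference. A sketch (GNU find/stat assumed; `--reffile` itself remains the requested, not-yet-existing option):

```shell
# Print every file under $2 that is byte-identical to reference file $1.
# The -size filter means cmp only runs on files that could match.
find_copies() {
    ref=$1; dir=$2
    size=$(stat -c %s -- "$ref")
    find "$dir" -type f -size "${size}c" -exec cmp -s "$ref" {} ';' -print
}
```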

Feature: save an md5sum/tree file for future comparison

It would be good to save the hash/analysis information from a specific fdupes run, in order to later compare this "virtual" file tree with a real one.

For example: I have a huge photo database. I can run a long fdupes analysis and store the information in a file. Then, if I later find some photos on an old HDD, I could just run fdupes on those photos against an import of the "virtual" huge database...
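Outside of fdupes, a stored checksum index gives much the same effect. A sketch with GNU `md5sum` (my workaround; the in-fdupes feature would presumably store more than hashes):

```shell
# Build a sorted checksum index of the archive tree $1 into file $2...
index_tree() {
    find "$1" -type f -exec md5sum {} + | sort > "$2"
}

# ...then check whether file $1 already exists somewhere in index $2.
in_index() {
    grep -q "^$(md5sum < "$1" | cut -d ' ' -f 1) " "$2"
}
```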

Option --order in 1.51

My intent was to keep the first file in alphabetic order of any duplicates. So I installed fdupes using macports and got

$ /opt/local/bin/fdupes --version
fdupes 1.51

I found no option for sorting, neither in the man page nor in the --help output. So I checked back here and found an --order option in the source code. I downloaded the sources, built them, and used the "--order name" option successfully.

Two issues:

Feature Request: Fuzzy file name match

Would be nice to have an option to find dups by fuzzy file name match.

When the files don't match perfectly on md5, this would provide an opportunity to (at least) view the files that "look" similar. One use case for this is music files obtained from multiple sources having similar names but different checksums.

Any simple algorithm could be used for the fuzzy match.

fdupes: option to not traverse filesystems

From @sandrotosi on December 20, 2015 14:4

From matrixhasu on October 08, 2009 21:55:20

Debian bug #496472 - http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=496472

It would be nice if fdupes had an option to not traverse filesystem boundaries, like the -xdev argument to find.

What version of the product are you using? On what operating system?
Version: 1.50-PR2

Original issue: http://code.google.com/p/fdupes/issues/detail?id=1

Copied from original issue: sandrotosi/fdupes-issues#1

delete files progressively, not just at the end of the scan

fdupes sometimes takes a long time to run, in particular if you have a lot of large files; if you interrupt fdupes while it is still parsing the file list for dupes, you basically lose all the processing done so far.

It would be ideal to have an option to delete dupes as fdupes finds them, so that whenever you Ctrl+C you have at least removed some of the duplicates.

fdupes does not find and change all large files

fdupes does not find everything for openSUSE RPM packages (many large directory levels). It works for other files but skips some directories.

%fdupes -s %{buildroot}%{_prefix}

I see that many files and directories are processed and fixed; about 90% is good, including some large directories (levels).

...
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/sirr.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/sirr.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/sirr.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/gate.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/gate.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/gate.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/booom.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/booom.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/booom.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/roaaar.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/roaaar.wav (was /u
sr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/roaaar.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/BD2K3_Switch_1.wav -> ../../BD2K3/sounds/BD2K3_Switch_1.wav (was /usr/share/rocksndiamonds/levels/
BD2K3/sounds/BD2K3_Switch_1.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/rhythmloop.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/rhythmloop.wav
 (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/rhythmloop.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/quirk.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/quirk.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/quirk.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/bug.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/bug.wav (was /usr/sha
re/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/bug.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/zonkdown.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zonkdown.wav (wa
s /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zonkdown.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/buing.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/buing.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/buing.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/knack.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/knack.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/knack.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/BD2K3_Timegate.wav -> ../../BD2K3/sounds/BD2K3_Timegate.wav (was /usr/share/rocksndiamonds/levels/
BD2K3/sounds/BD2K3_Timegate.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/zonkpush.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zonkpush.wav (wa
s /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zonkpush.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/base.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/base.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/base.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/gong.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/gong.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/gong.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/oeffnen.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/oeffnen.wav (was 
/usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/oeffnen.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/klopf.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klopf.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klopf.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/schlurf.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/schlurf.wav (was 
/usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/schlurf.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/exit.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/exit.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/exit.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/infotron.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/infotron.wav (wa
s /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/infotron.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/njam.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/njam.wav (was /usr/s
hare/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/njam.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/pusch.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/pusch.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/pusch.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/klumpf.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klumpf.wav (was /u
sr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klumpf.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/amoebe.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/amoebe.wav (was /u
sr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/amoebe.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/pling.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/pling.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/pling.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/halloffame.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/halloffame.wav
 (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/halloffame.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/knurk.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/knurk.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/knurk.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/zisch.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zisch.wav (was /usr
/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/zisch.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/miep.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/miep.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/miep.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/BD2K3_Null.wav -> ../../BD2K3/sounds/BD2K3_Null.wav (was /usr/share/rocksndiamonds/levels/BD2K3/sounds/BD2K3_Null.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/deng.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/deng.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/deng.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/BD2K3_Pipe_Swoop.wav -> ../../BD2K3/sounds/BD2K3_Pipe_Swoop.wav (was /usr/share/rocksndiamonds/levels/BD2K3/sounds/BD2K3_Pipe_Swoop.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/empty.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/empty.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/empty.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/blurb.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/blurb.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/blurb.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/klapper.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klapper.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/klapper.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/lachen.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/lachen.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/lachen.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/roehr.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/roehr.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/roehr.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Missions/sounds/autsch.wav -> ../../Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/autsch.wav (was /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_emanuel_schmieg/sounds/autsch.wav)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Negundo_World_3/Dangerous_Cave_Edition/template.level -> ../../Negundo_World_2/template.level (was /usr/share/rocksndiamonds/levels/Negundo_World_2/template.level)
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Walpurgis Collection/Walpurgis World/sounds/BD2K3_Pipe_Walk.wav -> BD2K3/sounds/BD2K3_Pipe_Walk.wav (was /usr/share/rocksndiamonds/levels/BD2K3/sounds/BD2K3_Pipe_Walk.wav)
[  168s] rm: cannot remove './/usr/share/rocksndiamonds/levels/Walpurgis': No such file or directory
[  168s] rm: cannot remove 'Collection/Walpurgis': No such file or directory
[  168s] rm: cannot remove 'World/sounds/BD2K3_Pipe_Walk.wav': No such file or directory
[  168s] /usr/lib/rpm/brp-suse.d/brp-25-symlink: line 159: test: too many arguments
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/Walpurgis Collection/Walpurgis Gardens/sounds/blank.wav -> Earth_Shaker_Collection/Earth Shaker Explosions/music/es_blank.wav (was /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/music/es_blank.wav)
[  168s] rm: cannot remove './/usr/share/rocksndiamonds/levels/Walpurgis': No such file or directory
[  168s] rm: cannot remove 'Collection/Walpurgis': No such file or directory
[  168s] rm: cannot remove 'Gardens/sounds/blank.wav': No such file or directory
[  168s] INFO: relinking /usr/share/rocksndiamonds/levels/rnd_the_h_world/hwld_dceos/tapes/001.tape -> 002.tape (was /usr/share/rocksndiamonds/levels/rnd_the_h_world/hwld_dceos/tapes/002.tape)

But later, rpmlint still reports unfixed duplicate files in more directories:

[  595s] rocksndiamonds-data.noarch: E: files-duplicated-waste (Badness: 100) 2040663
[  595s] Your package contains duplicated files that are not hard- or symlinks. You
[  595s] should use the %fdupes macro to link the files to one.
[  595s] 
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune6.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune6.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/graphics/es_potion.pcx /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/graphics/es_potion.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune3.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune3.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune7.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune7.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune1.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune1.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_earth.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_earth.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Level.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Level.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Generic4.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Generic4.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Generic2.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Generic2.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_rock_drop.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_rock_drop.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/ABToons.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/ABToons.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_win.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_win.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Generic3.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Generic3.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_gem_drop.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_gem_drop.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Elements4.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Elements4.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune8.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune8.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_potion.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_potion.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_rock_crumble.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_rock_crumble.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_fire.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_fire.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_blank.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/music/es_blank.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Elements3.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Elements3.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune4.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune4.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/BD2K3/BD2K3_Template.level /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_iii/BD2K3_Template.level
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/RocksMore.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/RocksMore.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/graphics/es_beans.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/graphics/es_beans.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/graphics/es_music.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/graphics/es_music.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune5.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune5.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_tune2.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_tune2.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_all_gems.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_all_gems.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Elements2.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Elements2.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Generic1.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Generic1.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_fire_bubbled.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_fire_bubbled.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Back1.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Back1.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_death.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_death.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Smily.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Smily.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/graphics/es_hero.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/graphics/es_hero.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_teleport.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_teleport.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/RocksFontBig.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/RocksFontBig.pcx
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_gem.wav /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_gem.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_monitor.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_monitor.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Earth_Shaker_Collection/Earth Shaker Explosions/sounds/es_bean.wav /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo/sounds/es_bean.wav
[  595s] rocksndiamonds-data.noarch: W: files-duplicate /usr/share/rocksndiamonds/levels/Contributions/Contributions_2004/rnd_rado_negundo_v/graphics/Back2.pcx /usr/share/rocksndiamonds/levels/Contributions/Contributions_2003/rnd_andreas_buschbeck/graphics/Back2.pcx
[  595s] 2 packages and 0 specfiles checked; 1 errors, 43 warnings.

Please fix this behavior for large data sets.

Feature request: implement reverse order when sorting on filename too

I am a Linux newbie, and fdupes is the first program I have compiled; I did so specifically to get the new sorting option -o --order=BY. My use case: I want to delete all files in one folder that are duplicates of files in another folder, without deleting some from the first folder and some from the second.

I first installed fdupes 1.5.1 via the Ubuntu Software Center, and it presented the duplicates ordered not consistently by path but (as I now know) by time. This required me to concentrate hard to press the right key (1 or 2) for each set of duplicates.

With version 1.6.1, now freshly compiled on my PC, I can rely on always pressing the same key to delete files from the same folder. But I still have to press keys many times.

If I could reverse the sort order, all duplicates in the folder I want to delete from would be number 1 (because sorting on file name is in fact sorting on the path name first), and so I could use the -N --noprompt option.

I used to program in C a long time ago; the reverse order could be as simple as returning -strcmp(...).
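For illustration, here is a minimal C sketch of that idea. The function name and the plain string-array signature are this example's assumptions (fdupes' real comparator operates on its own file structures); the point is only that negating strcmp's result flips ascending order into descending:

```c
#include <string.h>

/* Hypothetical reversed-order comparator for a qsort() over path
   strings: strcmp() yields ascending order, so negating its result
   yields descending order. */
static int compare_paths_reverse(const void *a, const void *b)
{
    const char *pa = *(const char *const *)a;
    const char *pb = *(const char *const *)b;
    return -strcmp(pa, pb);
}
```

Sorting an array of paths with qsort(..., compare_paths_reverse) would then list the lexicographically greatest path first in each group.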

fdupes misses duplicates when operating on a smaller set of directories

I tried two things:

$ fdupes -r data2/ data3/

then I thought it would be nice to know the duplicates that exist only within data3. So I ran

$ fdupes -r data3/

While this was running, I realized I could derive the output of the second call from the first result by filtering out the data2 entries. After both fdupes -r data3/ and the filtering finished, I compared the results. I expected differences only in the order of groups and of files within groups, but interestingly fdupes -r data3/ missed several files in groups of equal files. I processed an HFS+ partition with fdupes on Ubuntu; the data was created on a Mac and contains many identical .xml, .nib, .lproj and other files with names starting with ._ .
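For reference, the filtering step described above can be sketched in C. fdupes prints duplicate groups separated by blank lines, so one can drop every path under data2/ and keep only the groups that still contain at least two files. The function name, the "data2/" prefix, and the fixed buffer sizes are this example's assumptions:

```c
#include <stdio.h>
#include <string.h>

/* Read blank-line-separated duplicate groups from 'in', drop every
   path starting with 'drop_prefix', and write to 'out' only the
   groups that still have at least two members. */
static void filter_groups(FILE *in, FILE *out, const char *drop_prefix)
{
    char line[1024], kept[64][1024];
    int n = 0;
    size_t plen = strlen(drop_prefix);

    while (fgets(line, sizeof line, in)) {
        if (line[0] == '\n') {                     /* blank line: end of group */
            if (n >= 2) {
                for (int i = 0; i < n; i++)
                    fputs(kept[i], out);
                fputc('\n', out);
            }
            n = 0;
        } else if (strncmp(line, drop_prefix, plen) != 0 && n < 64) {
            strcpy(kept[n++], line);               /* keep paths outside data2/ */
        }
    }
    if (n >= 2)                                    /* flush a trailing group */
        for (int i = 0; i < n; i++)
            fputs(kept[i], out);
}
```

Piping the combined run through such a filter should, in theory, match fdupes -r data3/ up to ordering, which is exactly the comparison that exposed the missing files.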

Feature request: delete file then make symlink or hardlink to reference file

I would love an option that deletes a duplicate file and then makes a symlink to the reference file.
For example,
/some/dir/file1
/some/dir/file2

fdupes -d --symlinkfirst /some/dir

/some/dir/file1
/some/dir/file2 > /some/dir/file1

And an option to move the first file of each duplicate group to a directory, then point symlinks at the files in that directory.
For example:
/some/dir/file1
/some/dir/file2
fdupes --refdir=/some/ref_dir /some/dir
/some/ref_dir/file1
/some/dir/file1 > /some/ref_dir/file1
/some/dir/file2 > /some/ref_dir/file1

Feature request.

How about a "--report" or "--test" option that does the dedupe crawl but changes nothing, reporting instead how much disk space would be saved?
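A sketch of what such a report could compute, assuming the crawl has already produced the duplicate groups: a group of count identical files of a given size would free size * (count - 1) bytes, since one copy must remain. The struct and function names here are hypothetical, not fdupes internals:

```c
/* One group of identical files, as a hypothetical crawl result. */
struct dupe_group {
    long long file_size;  /* size of each file in the group, in bytes */
    int file_count;       /* number of identical copies               */
};

/* Bytes that deduplication would reclaim without touching anything:
   each group keeps one copy, so it contributes size * (count - 1). */
static long long reclaimable_bytes(const struct dupe_group *groups, int n)
{
    long long total = 0;
    for (int i = 0; i < n; i++)
        total += groups[i].file_size * (groups[i].file_count - 1);
    return total;
}
```

For example, one group of three 100-byte files plus one group of two 50-byte files would report 250 reclaimable bytes.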
