littleIdiot2
Hi,
I have multiple directories with duplicate files scattered throughout them. I am only interested in files with a certain extension (e.g. "*.bar").
I have a de-duplicated list of those file names - this is my "input_list.txt":
abc123.bar
abc456.bar
def123.bar
xxx999.bar
I know that the file "abc123.bar" exists in multiple places underneath a certain point on my file system, saved into different directories. I also know that files with duplicate names will always be identical copies - so I do NOT need to do any sort of checksum or date/file-size comparison on them. As long as the name is the same, I want ONE copy of it, from any of the directories it is stored in.
** All I want is one copy of each unique file in my "input_list.txt" copied into one directory. **
Can you suggest an efficient way to do this? Ideally I'd like a script that takes three arguments: the "input_list.txt", the top-level directory path to start the search from, and the output directory to copy the files to.
An indication of the script's progress would be useful, like this:
[pre]23-06-2014 13:00 Copying file 1 of 132: abc123.bar ...
23-06-2014 13:09 Copying file 2 of 132: abc456.bar ...
23-06-2014 13:13 Copying file 3 of 132: def123.bar ...
23-06-2014 13:15 Copying file 4 of 132: xxx999.bar ...[/pre]
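The requirements above could be sketched in Python roughly as follows. This is a minimal sketch, not a tested production script: the function name `copy_unique` and the timestamp format are my own choices, and it takes the first copy of each name it encounters while walking the tree, per the "any copy will do" assumption stated above.

```python
import os
import shutil
import sys
from datetime import datetime

def copy_unique(list_file, search_root, out_dir):
    # Read the wanted file names, one per line, preserving their order.
    with open(list_file) as f:
        wanted = [line.strip() for line in f if line.strip()]
    remaining = set(wanted)

    # Walk the tree once, remembering the first path found for each name.
    found = {}
    for dirpath, _dirnames, filenames in os.walk(search_root):
        for name in filenames:
            if name in remaining:
                found[name] = os.path.join(dirpath, name)
                remaining.discard(name)
        if not remaining:
            break  # everything located; no need to keep walking

    os.makedirs(out_dir, exist_ok=True)
    total = len(wanted)
    for i, name in enumerate(wanted, 1):
        stamp = datetime.now().strftime("%d-%m-%Y %H:%M")
        if name in found:
            print(f"{stamp} Copying file {i} of {total}: {name} ...")
            shutil.copy2(found[name], os.path.join(out_dir, name))
        else:
            print(f"{stamp} NOT FOUND ({i} of {total}): {name}")

if __name__ == "__main__":
    copy_unique(sys.argv[1], sys.argv[2], sys.argv[3])
```

Usage would be e.g. `python copy_unique.py input_list.txt /top/level/dir /output/dir`. Because the tree is walked only once (rather than running a separate `find` per file), this stays efficient even with a long input list.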