find command taking forever to complete

tektip9191

Programmer
Oct 30, 2007
2
US
There are millions of files and the shell script below never completes. I think find is taking forever.
How can I simplify this script so it finishes faster?


for i in $(find `ls -d converted*` -name "*.xml")
do
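# print the long listing of any file whose size (field 5 of ls -l output) is exactly 1 byte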
ls -lrt $i | awk '{ if ( $5 == 1 ) print $0 }'
done;



Appreciate your help.

Thanks.
 
If I read your statement right, you want to find all .xml files of size 1 byte.

Check your version of find and see what values are allowed for the -size specifier. On AIX, that syntax is -size 1c:

find converted* -size 1c -name "*.xml"

It will still take a while to run on millions of files, but at least this way you're running just one process instead of spawning two subprocesses (an ls and an awk) per file, which adds up quickly over millions of files.
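If your find happens to be GNU find rather than AIX's (an assumption; the thread so far has only mentioned AIX), a one-process sketch that also shows each file's size next to its name would be:

# GNU find only: -size 1c matches files of exactly 1 byte;
# %s prints the size in bytes and %p the path
find converted* -type f -name "*.xml" -size 1c -printf '%s %p\n'

I've added -type f so that directory entries whose names happen to end in .xml don't slip into the results.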
 
Also, how do I find files smaller than 5 bytes with the find command? Thanks.
 
My version of find only matches exact file sizes, so we'll fall back to a two-process version.

It still spawns only two processes in total (one run of find and one run of awk), rather than an awk for each file checked:

find converted* -type f -name "*.xml" -ls | awk '{ if ( $7 < 5 ) print $11 }'
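If your find lacks -ls, a rough portable alternative is to batch the filenames through ls using the POSIX -exec ... {} + form (assuming your find supports it, and that the filenames contain no whitespace, since this relies on ls -l column positions):

# -exec ... {} + passes many files per ls invocation, so only a handful of
# processes run in total; in ls -l output $5 is the size and $9 the name
find converted* -type f -name "*.xml" -exec ls -l {} + | awk '$5 < 5 { print $9 }'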
 
[tt]find ... -size [red]-5c[/red][/tt]

also works.
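Putting the pieces of this thread together, the full command would presumably look something like this (assuming your find accepts the c suffix for bytes; -5c means strictly fewer than 5 bytes):

find converted* -type f -name "*.xml" -size -5c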


HTH,

p5wizard
 