
grep based on file date

Status
Not open for further replies.

wilmington (Technical User)
Nov 21, 2002
I need to grep for strings in a group of files, but only if the last access time is within the past 24 hours.

I looked at find, but the problem is that it descends into the subdirectories, and there are a lot of them.

How would you script this?
 
for file in $(find . -type f ..more options...)
do grep ...your..string.. "$file"
done
-----------
when they don't ask you anymore where they come from, and they don't tell you anymore where they go ... you're getting older!
 
you can stop find from recursing:

Code:
find . \( -type d ! -name . -prune \) -o \( -type f -atime -1 -print \)

Cheers, Neil
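Putting the pieces together — a sketch of a complete answer to the original question, combining the pruning trick above with an access-time test (`-atime -1` matches files accessed within the last 24 hours). The string "needle" and the demo files are placeholders for illustration, not from the thread:

```shell
# Build a throwaway directory tree to demonstrate on.
demo=$(mktemp -d) && cd "$demo"
mkdir sub
echo "needle here" > recent.txt
echo "needle here" > sub/skipped.txt     # in a subdirectory: pruned
echo "needle here" > stale.txt
touch -a -t 200001010000 stale.txt       # push its access time into the past

# Current directory only, accessed in the last 24 hours, grep each hit.
find . \( -type d ! -name . -prune \) -o \
       \( -type f -atime -1 -exec grep -l "needle" {} \; \)
# lists only ./recent.txt
```

Note that `-atime` depends on the filesystem actually recording access times; with mount options like noatime you would fall back to `-mtime -1` (modification time) instead.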
 
Sure toolkit, but why not a simple ls?
for file in $(ls -d yourdir/*)
do [ -d "$file" ] && continue
grep ..... "$file"
done
-----------
when they don't ask you anymore where they come from, and they don't tell you anymore where they go ... you're getting older!
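A safer variation on the loop above (a refinement, not from the thread): iterate over the glob directly instead of parsing ls output, which breaks on filenames containing whitespace. "yourdir" and "pattern" are placeholders:

```shell
# Throwaway demo directory.
demo=$(mktemp -d) && cd "$demo"
mkdir -p yourdir/sub
echo "pattern here" > yourdir/a.txt
echo "no match"     > yourdir/b.txt

# The glob expands to pathnames directly; no ls, no word splitting.
for file in yourdir/*; do
    [ -d "$file" ] && continue        # skip subdirectories
    grep -l "pattern" "$file"         # prints yourdir/a.txt
done
```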
 
For directories containing a large number of files, the shell's '*' expansion can overflow the argument-list limit and make ls fail; find does not suffer from this. Also, try benchmarking the 'for file in' loop against find as the number of files in a directory grows: find scales better. Finally, find lends itself to simple one-liners:

find files that contain a certain string:
Code:
find . -type f | xargs grep -l "foo"

find files and move to a new directory:
Code:
find . -type f -name "*old" | xargs -i mv {} OLD_FILES/

etc...

Cheers, Neil :)
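One caveat on the pipes above (an aside, not from the thread): plain `find | xargs` splits on whitespace, so filenames containing spaces break it. GNU and BSD find/xargs support a null-delimited handoff:

```shell
# Throwaway demo with an awkward filename.
demo=$(mktemp -d) && cd "$demo"
echo "foo" > "file with spaces.txt"

# -print0 / -0 delimit pathnames with NUL bytes, which cannot
# appear in filenames, so any name passes through intact.
find . -type f -print0 | xargs -0 grep -l "foo"
# prints ./file with spaces.txt
```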
 
