
script for removing selected files


4plexman (Programmer)
Dec 31, 2003
I want to create a maintenance script to remove certain groups of files - like old log files I no longer want to keep. They may be in different directories.

Are there any existing scripts that I can use (there must be), or ideas?

I'm currently looking at creating a script that builds a list file and then processes it in a loop, removing each file named in the list.
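Roughly this sort of sketch, assuming one path per line in the list file (the directory, pattern, and list-file names are just placeholders):

Code:
# build the list of candidates, then remove them one per line
find /var/log -type f -name '*.log' -mtime +30 > /tmp/purgelist
while read f
do
    rm "$f"
done < /tmp/purgelist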
 
The easier way is just to run the following command from crontab, assuming you want to delete files with a certain extension:

find /YOURtopDIR -name "*.YOURextension" -exec rm {} \;

Just use it with care :)

Long live king Moshiach !
 
Yes, I know that can be done, but I only want regular files (there are subdirectories that I want to leave alone) and only those older than 30 days. There may be other directories and file extensions too.

I'm looking at something like
find <mydir> -xdev -type f -mtime +30 -print | egrep -v "$EXCLUDETEXT" > filename

then deleting the names listed in filename.
 
Instead of catching the names in a file, just pipe them to xargs:

Code:
find <mydir> -xdev -type f -mtime +30 -print | egrep -v "$EXCLUDETEXT" | xargs rm

Rod Knowlton
IBM Certified Advanced Technical Expert pSeries and AIX 5L

 
Hmmm, never used xargs before.
So if I need to look in different directories, I could have a number of these lines.
 
Hey 4plexman

There is a free piece of software called TIDYSYS that works great.
Do a search on Google for TIDYSYS and download it. We use it on all our RS6K systems.

Peace Out !!!!!!
 
Strange, I don't know why you wouldn't just do this:
find <my dir> -name "*sstring*" -mtime +30 -type f -xdev -ls -exec rm {} \;
If you have more than one directory, loop over them in a for loop, as in the sketch below.
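A rough sketch, assuming the same age and name tests as above (the directory names are only placeholders):

Code:
# run the same find/rm over several directories; these paths are placeholders
for dir in /var/log /home/app/logs /tmp/reports
do
    find "$dir" -xdev -type f -name "*sstring*" -mtime +30 -ls -exec rm {} \;
done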
 
This is really driving me nuts:

Anytime I put an -exec clause in a find statement inside the cron I get this:

find: 0652-018 An expression term lacks a required parameter.


*****************************************************************
cron: The previous message is the standard output
and standard error of one of the cron commands.

which is bullhockey because I supplied this statement to the cron:

find ~/shist -name "*.sh_hist*" -mtime +30 -exec rm {} \;

I've found a work-around by throwing the whole dang thing into a shell script and calling the shell script, but it's irritating as hell when I know it should run inside cron as is. I'm running AIX 4.3 - could that be the problem? Any help would be appreciated.
 
First, two safety tips:

1. Use single quotes around your filespec ('*.sh_hist*'). Otherwise the command will blow up the first time it's launched in a directory containing more than one file that matches the pattern.

2. Use a fully qualified pathname for rm (/usr/bin/rm) so that nobody can sneak a nasty rm substitute somewhere earlier in your path.

As to your problem, it's probably a matter of insufficiently escaped special characters. I'd start by escaping the final backslash and semicolon, like so: \\\;

If that doesn't do it, try escaping the curly braces as well.
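Putting those together, the crontab entry might end up looking something like this (the 2 a.m. schedule is just for illustration; the triple backslash is the escaping described above):

Code:
# nightly at 2:00 - single-quoted pattern, full path to rm, extra-escaped \;
0 2 * * * /usr/bin/find ~/shist -name '*.sh_hist*' -mtime +30 -exec /usr/bin/rm {} \\\;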



Rod Knowlton
IBM Certified Advanced Technical Expert pSeries and AIX 5L
CompTIA Linux+
CompTIA Security+

 
Just an FYI. Escaping the backslash and the semicolon worked like a charm. Thanks all.
 