Working with Large Directories

Status: Not open for further replies.

DanNJ (MIS), Mar 19, 2003

I have a Sun Blade 2000 running Solaris 8 that works with lots of files each day, around 50 thousand files a day. We need to maintain at least three days' worth of these files for research purposes in case an error occurs.

There are two problems with this.

1. Our crontab fails to delete files older than three days because of the size of the directories. This causes our system to fill up and crash due to low disk space.

2. Sometimes there are so many files that even doing an ls -la takes an hour. This makes researching problems very difficult.

Does anyone have any tips for working with very large directories? Also, are there any ways to delete files or directories quickly?
 
Well, I haven't had to deal with this, but here are a few little things that wouldn't hurt.

First, you can specify the [tt]noatime[/tt] mount option in the [tt]/etc/vfstab[/tt] file. This stops the OS from recording an access time for every little access to a file (like an [tt]ls[/tt]), which cuts out a small percentage of I/Os to that volume. Creation time and modification time will still be recorded, but skipping access times can help.
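
For reference, a [tt]/etc/vfstab[/tt] entry with [tt]noatime[/tt] added would look something like the line below; the device names and mount point are just placeholders, so keep whatever your existing entry uses and append [tt]noatime[/tt] to the options column, then remount the filesystem (or reboot) for it to take effect.
Code:
#device to mount    device to fsck      mount point  FS type  fsck pass  mount at boot  mount options
/dev/dsk/c1t1d0s6   /dev/rdsk/c1t1d0s6  /data        ufs      2          yes            noatime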

As far as deleting the files in the directory quickly goes, if it's OK to delete them all at once, you could try something like...
Code:
mv /path/fatdir /path/fatdir.old ; mkdir /path/fatdir
(rm -rf /path/fatdir.old ; sync) &
This moves the directory to a new name, creates a fresh empty one to begin filling up again, then deletes the old files in the background.

Do they need to be in the same directory? Maybe change whatever's creating the files to put them in different directories based on time or some other criterion; see the sketch below.
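
For example (just a sketch; [tt]/path/fatdir[/tt] and the step that actually writes the file are assumptions), the producing script could drop each new file into a per-day subdirectory:
Code:
# pick a subdirectory named for today's date, creating it if needed
DIR=/path/fatdir/`date +'%Y%m%d'`
[ -d "$DIR" ] || mkdir -p "$DIR"
# ... whatever creates the file would then write into $DIR ...
cp newfile "$DIR"/
That keeps any one directory down to a single day's worth of entries, so listing and deleting stay manageable.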

You could maybe make a cron job to rename the directory every night so that any given directory only holds one day's worth of files. Say the files are created in [tt]/path/fatdir[/tt], the cron job could be something like...
Code:
59 23 * * * (mv /path/fatdir /path/fatdir.`date +'\%Y\%m\%d\%H\%M'` ; mkdir /path/fatdir) >> fatdir.log 2>&1
Deletions will still take some time, but renaming a directory is almost instantaneous. (Note the backslashes before the [tt]%[/tt] signs; cron treats an unescaped [tt]%[/tt] in the command field as a newline.)
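
To hold to the three-day retention, a second cron entry could then remove the renamed directories once they age out. Again, just a sketch; it assumes the renamed copies all match [tt]/path/fatdir.*[/tt] and that at least one exists when the job runs:
Code:
30 1 * * * find /path/fatdir.* -prune -mtime +3 -exec rm -rf {} \; >> fatdir.log 2>&1
The [tt]-prune[/tt] keeps [tt]find[/tt] from descending into the big directories, so it only examines the top-level names, and [tt]-mtime +3[/tt] picks out the ones whose contents haven't changed in more than three days.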

Hope this helps.

 
For problem 1, what is your crontab entry? Perhaps it can be fixed...
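
One common culprit with directories this size is an entry along the lines of [tt]rm /path/fatdir/*[/tt], where the glob expands to tens of thousands of names and blows past the shell's argument-list limit. A [tt]find[/tt]-based entry avoids the expansion entirely; here's a sketch (the path, the three-day cutoff, and the log file are assumptions based on your description):
Code:
0 2 * * * find /path/fatdir -type f -mtime +3 -exec rm -f {} \; >> /var/tmp/fatdir-clean.log 2>&1
Piping [tt]find[/tt]'s output to [tt]xargs rm -f[/tt] instead of using [tt]-exec[/tt] would run faster, since it batches the removals, but the [tt]-exec[/tt] form works everywhere.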

Annihilannic.
 
You might also look at this thread...

thread60-635715
Hope this helps.

 