
Working with large directories

Status: Not open for further replies.

DanNJ (MIS) | Mar 19, 2003 | 29 | US

I have a Sun Blade 2000 running Solaris 8 that works with a lot of files each day, around 50,000. We need to keep at least three days' worth of these files for research purposes in case an error occurs.

There are two problems with this.

1. Our crontab job fails to delete files older than three days because of the size of the directories. This causes our system to fill up and crash due to low disk space.

2. Sometimes there are so many files that even an "ls -la" takes an hour. This makes researching problems very difficult.
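On the slow-listing point, most of the cost of "ls -la" in a huge directory is the sort plus a stat() of every file. A minimal sketch of cheaper alternatives; /tmp/lsdemo is a stand-in path for the real data directory, and the demo files exist only so the script runs anywhere:

```shell
#!/bin/sh
# /tmp/lsdemo stands in for the real data directory (assumption).
DIR=/tmp/lsdemo
mkdir -p "$DIR"
touch "$DIR/a.dat" "$DIR/b.dat"

# ls -la sorts every entry and stat()s every file; with 150,000 files
# that dominates the runtime. -f prints names in directory order,
# skipping both the sort and the per-file stat().
ls -f "$DIR"

# To examine only recent files, let find filter by age instead of
# listing everything (files modified within the last day):
find "$DIR" -type f -mtime -1
```

The same idea applies to any tool: avoid anything that must read and sort the full directory when you only need a slice of it.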

Does anyone have any tips for working with very large directories? Also, are there any ways to delete files or directories quickly?
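For the deletion side, a common reason cron cleanups fail in huge directories is that a wildcard like "rm $DIR/*" expands every name on the command line and hits "arg list too long". Streaming names through find avoids that. A minimal sketch, assuming a hypothetical data directory (/tmp/feed_demo here, with demo files created so it runs stand-alone):

```shell
#!/bin/sh
# /tmp/feed_demo stands in for the real data directory (assumption).
DIR=/tmp/feed_demo

# Demo setup: one stale file (mtime forced to Jan 2020) and one fresh one.
mkdir -p "$DIR"
touch -t 202001010000 "$DIR/stale.dat"
touch "$DIR/fresh.dat"

# find emits pathnames one at a time, so it never hits the
# "arg list too long" failure that shell globbing would.
# Solaris 8 find has no -delete, so batch the removals through xargs;
# -mtime +3 selects files last modified more than 3 days ago.
find "$DIR" -type f -mtime +3 | xargs rm

ls "$DIR"
```

Running this from cron instead of a glob-based rm keeps the cleanup working no matter how many files accumulate.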
 
Put the directories as close to root as you can so their names are cached; long pathnames will not fit in the name cache. Run a cron job that cycles a search for a non-existent filename through the directory so the name cache stays warm.
Check vmstat -s for cache hits, and raise the limits in /etc/system.
Google "name cache lookup hits" and change system parameters as needed.
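The cache being described is the Solaris directory name lookup cache (DNLC); vmstat -s reports it as "total name lookups (cache hits NN%)". A minimal sketch of extracting that hit rate; the sample line is captured output standing in for a live "vmstat -s" so the script runs anywhere, and the ncsize value shown is an assumption to size against your own working set:

```shell
#!/bin/sh
# On the live box you would pipe real output:  vmstat -s | grep 'name lookups'
# Here a captured sample line stands in (assumption) so the sketch runs anywhere.
sample='  4791870 total name lookups (cache hits 92%)'

# Pull out the hit percentage.
hits=`echo "$sample" | sed 's/.*cache hits \([0-9]*\)%.*/\1/'`
echo "DNLC hit rate: ${hits}%"

# If the rate is low, the tunable is ncsize in /etc/system, e.g.:
#   set ncsize = 131072
# (example value only; a reboot is required for it to take effect)
```

A hit rate that stays well below the high nineties on a box churning 50,000 files a day is a sign the cache is too small for the directory working set.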
 
