Working with large directories
I have a Sun Blade 2000 running Solaris 8 that works with a lot of files each day, roughly 50,000 files a day. We need to keep at least three days' worth of these files for research purposes in case an error occurs.
There are two problems with this.
1. Our cron job fails to delete files older than three days because of the size of the directories. This causes our system to fill up and crash due to low disk space (the kind of cleanup I mean is sketched below).
2. Sometimes there are so many files that even an ls -la takes an hour. This makes researching problems very difficult.
Does anyone have any tips for working with very large directories? Also, are there any ways to delete files or directories quickly?
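For reference, the cleanup I have in mind is roughly along these lines. This is only a simplified sketch with a placeholder path and schedule, not our actual crontab entry; it streams file names through xargs so nothing has to expand one enormous argument list:

# crontab entry: run at 02:00, remove files more than three days old
# /data/feed is a placeholder path; assumes file names contain no whitespace
# (Solaris find and xargs have no -print0 / -0)
0 2 * * * find /data/feed -type f -mtime +3 -print | xargs rm -f

Assuming something like this is the right general approach, is there a faster way to run it against directories holding this many files?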