I'm helping with a web project that creates static HTML pages from a database. Right now, it places all these pages in a single directory. So far there are about 10,000 of them, with a good chance of reaching 20,000 by the end of the year.
I don't know much about Linux, but I vaguely remember from Unix training years ago that a filesystem becomes increasingly inefficient once a single directory holds more than a certain number of files.
Is this correct? I would like to argue for a different method of page generation, but I don't have facts or theories to back me up.
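For illustration, this is roughly the kind of alternative I have in mind: spreading the generated pages across subdirectories instead of one flat directory. This is just a minimal Python sketch of the idea; the page-id naming, the output root, and the hash-based layout are my own assumptions, not how the project currently works.

```python
import hashlib
from pathlib import Path

OUTPUT_ROOT = Path("site")  # hypothetical output root for the generated pages


def page_path(page_id: str) -> Path:
    """Return a sharded path like site/3f/article-10432.html
    instead of putting every file directly under site/."""
    digest = hashlib.md5(page_id.encode("utf-8")).hexdigest()
    # One level of 2-hex-character subdirectories spreads ~20,000 pages
    # across at most 256 directories (roughly 80 files each).
    return OUTPUT_ROOT / digest[:2] / f"{page_id}.html"


def write_page(page_id: str, html: str) -> None:
    path = page_path(page_id)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(html, encoding="utf-8")


# Example: write_page("article-10432", "<html>...</html>")
# -> site/xx/article-10432.html, where xx comes from the hash of the id
```

I don't know whether something like this is actually necessary, which is really what I'm asking.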
Any help or pointers to other sources would be very welcome.