I believe the maximum number of files the operating system can handle is around 65,000. Can anyone tell me why? What happens to the operating system if there are too many files per folder?
The limits are set not by the operating system, but by the FILE SYSTEM. NTFS can store a nearly unlimited number of files. FAT16 has a limit of 65,536, and FAT32 can handle 4,194,304.
What happens when you reach that limit? You can't store any more files, and Windows might crash because it can't create any more files (and Windows likes to create temp files).
OK, here's the problem. We have an old legacy system that opens, performs calculations on, and then closes nearly 100,000 files, all residing in one folder. The application is 16-bit and runs on a Windows 2000 server with an NTFS file system through Terminal Server. Recently we have begun to see errors on exiting the application saying that certain procedures cannot be found. Since the executable has not changed since December, we are looking at environmental causes.
Anyone have any ideas?
When an application opens a file in a folder, the directory search is sequential. When the number of files is large, the search time grows accordingly. Maybe a timeout in the calling function was reached.
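One quick way to sanity-check that theory is to measure it. Here's a small Python sketch (the file names are hypothetical, not from your legacy app) that fills a temporary folder with progressively more files and times how long a single open takes:

```python
import os
import tempfile
import time

def time_open_in_dir(n_files):
    """Create a folder with n_files empty files, then time opening one of them."""
    with tempfile.TemporaryDirectory() as d:
        for i in range(n_files):
            # Hypothetical names; your app's ~100,000 real files would sit here.
            open(os.path.join(d, f"file_{i:06d}.dat"), "w").close()
        # Pick a file from the middle of the directory listing.
        target = os.path.join(d, f"file_{n_files // 2:06d}.dat")
        start = time.perf_counter()
        with open(target) as f:
            f.read()
        return time.perf_counter() - start

for n in (100, 1000, 10000):
    print(f"{n:>6} files: open took {time_open_in_dir(n) * 1000:.3f} ms")
```

If the open time climbs sharply with directory size on the server in question, the folder itself is the suspect; if it stays flat, the "procedures cannot be found" errors likely have another environmental cause.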