I am writing a C program that reads in large text files. I was wondering how to determine if a text file is too big to fit into the total amount of memory available to the process running the program?
Most modern operating systems (of the desktop variety and upwards) use some kind of virtual memory (along with a swap device) to give programs more memory than is physically available in the machine. The total physical RAM + swap space is what is available to all programs.
Even if your OS can answer the question, there is no guarantee that it would remain constant through the life of your program. Other programs running on the machine could claim some of that memory, meaning your program would fail to allocate memory sooner than you thought.
I suppose the next question is what are you planning to do with the file once you've read it into memory?
Thank you for the prompt reply. Actually, what I am doing is reading in 2 text files as strings (say s and t), then creating a 2D array of size strlen(s) * strlen(t). So I guess I asked the wrong question. I should have asked how to determine if that 2D matrix, along with the 2 strings, will all fit in memory. Can I make use of something like RLIMIT_DATA, etc.? Thanks a million for the input.
The short answer is: try to allocate it. If the allocation succeeds, carry on and do whatever you want to do. If it fails, print an error message and quit.
Unless your matrix (and your code) can reasonably be made to work on subsets of the matrix (say, by splitting it into quarters), there is not a lot else you can do.
> Can I make use of something like RLIMIT_DATA, etc?
IIRC, these are theoretical limits based on some ideal. They're not representative of what your machine is capable of at that moment in time.
For example, my "ulimit -d" prints "unlimited", which is of course nonsense in any real universe.