
ULIMIT and 'too many open files'

Status
Not open for further replies.

DGRFL

Programmer
Dec 28, 2000
36
BE
Hello,

My communication server process tells me it can't open more files. Here is some more information.

System limits (checked with the ulimit command):
- Soft limit: 1024
- Hard limit: 4096

My program tries to open 260 files (argument passed on command line)

#include <stdio.h>
#include <stdlib.h>        /* atoi() */
#include <errno.h>
#include <unistd.h>        /* sysconf() */
#include <sys/resource.h>

int main(int argc, char *argv[])
{
    int i;
    int count;
    FILE *aFiles[6000];
    long lValue;

    lValue = sysconf(_SC_OPEN_MAX);
    printf("sysconf returns %ld\n", lValue);

    if (argc < 2) {
        fprintf(stderr, "usage: %s count\n", argv[0]);
        return 1;
    }
    count = atoi(argv[1]);
    for (i = 0; i < count + 5; i++)
    {
        aFiles[i] = fopen("files.c", "r");   /* was: aFiles = fopen(...) */
        if (aFiles[i] == NULL)
            printf("Error opening file %d %d\n", i, errno);
    }
    return 0;
}

The sysconf() call reports 1024, but starting from file 253 I receive EMFILE errors.

Somebody at Sun Support told me to change the OPEN_MAX definition in the limits.h header file, but my project does not include that header, and I cannot believe this is a good solution.

Does anybody have any suggestions ?


Regards,

Luc
 
Hello,

Here is some more information. It seems that the FILE structure as defined by 32-bit Solaris 8 stores the file descriptor in an 8-bit field, so stdio can only address 256 descriptors (0 through 255).

This means that once a process uses descriptors above 255, functions like fopen(), fprintf(), fseek(), etc. can no longer be used on them!

Is this true, and is there a workaround? If not, I need to revise the application I am working on thoroughly.

Regards,

Luc
 