I'm working on a data entry web interface that accepts a set of fields for a technique definition. Each definition set is stored in a separate text file. I'm using CGI.pm's 'save' method to write the files, and then trying to use

open(IPF,"<definitionFile");
$technique = new CGI(IPF);

to read the data back from the files. As long as I only create one or two CGI objects with 'new' when reading the files, it works fine.

Problem:
I need to be able to read many files (up to 200) to report all techniques. It appears that CGI.pm will only allow a limited number (less than 30) of 'new' CGI objects to be created for reading the files. Is there a destructor method I can use to get a clean slate after each read? Or does anyone have any ideas why this is happening?
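For context, here is a stripped-down version of the save/read round trip I'm doing (the file name and the 'toolName' field are invented for this example):

#!/usr/bin/perl -w
use CGI;

my $query = new CGI;

# Writing: dump the submitted parameter set to its own definition file.
open(OPF, ">./db/defs/42") || die "Can't write definition: $!";
$query->save(\*OPF);    # writes one name=value pair per line
close OPF;

# Reading: rebuild a CGI object from the saved file.
open(IPF, "<./db/defs/42") || die "Can't read definition: $!";
my $technique = new CGI(IPF);
close IPF;
print $technique->param('toolName'), "\n";

Done one file at a time like this, the round trip works exactly as advertised; the trouble only starts when I loop over the whole directory, as below.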
CODE:

sub displayAll
{
    # One definition file per technique ID; skip dotfiles.
    opendir DEFDIR, "./db/defs/";
    @techIDs = sort numerically grep(!/^\./, readdir(DEFDIR));
    closedir DEFDIR;

    print $query->a({-href=>"$cgiProg"}, "Return to Tool Management Front Page.<HR>");

    foreach $techID (@techIDs)
    {
        $toolFile = './db/defs/'.$techID;
        open(IPF, "<$toolFile") ||
            &showError("Failed to open $toolFile, for display, $!");
        print "Working on $toolFile<BR>\n";
        $tool = new CGI(IPF);    # hangs here after about 20 files open
        print "<table width=\"80%\" border=\"1\" bgcolor=\"white\">";
        foreach my $item (@fields)
        {
            print "read parms and print here";
        }
        print "</table><HR>";
        close IPF;
        undef $tool;
    }
    print $query->a({-href=>"$cgiProg"}, "Return to Tool Management Front Page.");
} # end displayAll
Thanks in advance.

keep the rudder amid ship and beware the odd typo