I get core dumps when I use Perl to read very large files using either:

Code:
while(<IN>) { }

or

Code:
perl -ne ' ' filename
I am trying to read files that are 300+ MB on a machine with six CPUs and 8 GB of memory. Disk space isn't a problem.
Is there some way to keep Perl from reading the whole file into memory, even though I'm only asking it to process one line at a time? Is it possible to flush the memory after so many lines to avoid a core dump?
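Since while(<IN>) is supposed to read one record at a time, memory should stay flat unless a single record is enormous (for example, if the file contains no newlines at all, the first read slurps everything). A minimal diagnostic sketch for checking the longest record, where filename stands in for the actual input file:

Code:
perl -ne '$max = length if length > $max; END { print "longest record: $max bytes\n" }' filename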
My code is simple, so I don't understand why it continues to eat up memory until it core dumps.
Here's the code within the while loop:
Code:
chomp;
my @f = split /\|/;                          # bare split no longer fills @_ (removed in Perl 5.12)
my $line = join '|', @f[11 .. 13, 0 .. 10];  # '|' needs no backslash in join's first argument
print "$line\n";
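For comparison, the same field reordering as a one-liner; a sketch assuming each record holds exactly 14 pipe-delimited fields (@F is 0-indexed, so the file's 12th field is $F[11]):

Code:
perl -F'\|' -lane 'print join "|", @F[11 .. 13, 0 .. 10]' filename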