If I run the following code, it results in a memory leak. I monitor some internal websites, and after a few hours or days the memory usage can become outrageous, sometimes running the machine out of memory. I didn't put in a sleep or delay, just so that you can see the problem: when you run this code you'll see the memory usage climb rapidly. You can also change the URL to some bogus address without a properly formatted URL, like $url = "lksjfdlsjf", and the memory leak happens as well, and much more quickly.
Is there a way to resolve this issue?
Thanks in advance!
Code:
#!/usr/bin/perl
use DBI;
use DBD::Pg;
use POSIX;
use HTML::Parse;
use LWP::Simple;
use URI::URL;
while (1) {
    $url     = "http://cnn.com";
    $content = get $url;                        # fetch the page
    $content = parse_html($content)->format;    # build an HTML parse tree and format it as text
    print "$content\n";
}
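
For reference, here is a minimal sketch of one possible fix, assuming the leak comes from the parse tree that parse_html() builds: it returns a tree of HTML::Element nodes that hold circular parent/child references, so the tree isn't freed automatically unless you delete it explicitly. The undef check on get() and the 60-second sleep are my own assumptions (get() returns undef on a failed fetch, e.g. a bogus URL), so adjust those to your setup:

#!/usr/bin/perl
use strict;
use warnings;
use HTML::Parse;
use LWP::Simple;

while (1) {
    my $url     = "http://cnn.com";
    my $content = get($url);
    next unless defined $content;       # get() returns undef on failure (e.g. a bogus URL)

    my $tree = parse_html($content);    # returns an HTML::Element parse tree
    my $text = $tree->format;           # format the tree as plain text
    $tree->delete;                      # break circular references so the tree's memory is reclaimed

    print "$text\n";
    sleep 60;                           # poll interval (assumption; adjust to taste)
}

Keeping the tree in a variable and calling $tree->delete after you're done with it should let each iteration's memory be reclaimed instead of accumulating.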