
any ideas for an url checker?

Status
Not open for further replies.

jonsmith1982 · Technical User · May 13, 2005 · GB
I'm new to programming and I'm trying to make a webpage checker.
I found this script and it seems to work OK, but I'd like to replace the website URL with a variable so it can be changed without having to edit any script.
I'd also like to add a second part to the website URL,
(i.e. $website = the site address, $webpage = /members.html)


use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new();
$ua->agent("TestBot/0.1"); # pretend we are a very capable browser

my $req = HTTP::Request->new('GET' => 'http://www.google.com');
$req->header('Accept' => 'text/html');

# send request
my $res = $ua->request($req);

# check the outcome
if ($res->is_success) {
    print $res->content;
} else {
    print "Error: " . $res->status_line . "\n";
}
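To show the idea being asked about, here is a minimal sketch of the same request with the URL split into two variables. The site and page values are made-up placeholders, not anything from the thread:

```perl
use strict;
use warnings;
use LWP::UserAgent;   # also pulls in HTTP::Request

# Placeholder values; swap in whatever site and page you actually want to check
my $website = "http://www.example.com";
my $webpage = "/members.html";

my $ua = LWP::UserAgent->new();
$ua->agent("TestBot/0.1");

# Perl joins strings with the dot operator, so the two variables
# combine into one URL without editing the rest of the script
my $req = HTTP::Request->new('GET' => $website . $webpage);
$req->header('Accept' => 'text/html');
print $req->uri, "\n";

# my $res = $ua->request($req);   # uncomment to actually send the request
```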
 
I've semi-solved my problem; here's some code that works better for checking URLs, but I can't seem to add two variables together.
Can anyone help?

In the if (!head ($url)) section I'd want something like ($url+$web),
also in the push (@goodUrls, ($url)) bit.
#$web = "/index.html";

use LWP::Simple;

$url = "..."; # the address was stripped by the forum's link handling
print "Checking $url\n\n";
if (!head($url)) {
    push(@badUrls, $url);
} else {
    push(@goodUrls, $url);
}
print @goodUrls;
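On the "adding variables together" question: Perl concatenates strings with `.`, not `+` (`+` forces numeric context, so two non-numeric strings come out as 0). A tiny sketch with made-up values:

```perl
use strict;
use warnings;

my $url = "http://www.example.com";   # placeholder base address
my $web = "/index.html";              # placeholder page

# '+' would treat both strings as numbers; '.' joins them as text
my $full = $url . $web;
print "$full\n";
```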
 
Change:
Code:
my $req = HTTP::Request->new ('GET' => 'http://www.google.com'); $req->header('Accept' => 'text/html');
to:
Code:
my $webpage = "http://www.google.com";
my $req = HTTP::Request->new ('GET' => "$webpage"); $req->header('Accept' => 'text/html');
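To make the URL changeable without editing the script at all, it can also be taken from the command line. A sketch using LWP::Simple; the fallback address is a placeholder, not a real target:

```perl
use strict;
use warnings;
use LWP::Simple;

# Take the URL from the command line; fall back to a placeholder if none given
my $url = shift(@ARGV) || "http://www.example.com";
print "Checking $url\n";

# head() returns true only if the server answers a HEAD request
if (head($url)) {
    print "$url looks reachable\n";
} else {
    print "$url appears down\n";
}
```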

- Rieekan
 
That didn't seem to work for me; it kept asking for an explicit package name.
I got some different code that I posted just below the original, and I seem to have solved my query in that post too.
Thanks anyway.
 
Thought I'd show you the little program I made;
it's found some useful webpages for me.
There's a txt file with lots of extra URL sections to check for on a website here:

It's not bad considering this is my first programming project, since I started learning a week ago!

#!/"program files"/perl/perl
#this is my url exploit checker

use strict;
use warnings;
use LWP::Simple;

print "please type url!\n";
my $url = <STDIN>;
chomp($url);

# base URL to prepend to each entry (this line was mangled by the forum's
# link handling; presumably it held the address typed in above)
my $Urls = $url;

my (@goodUrls, @badUrls);

open(EX, "<exploits.txt") or die "Can't open input file: $!\n";
while (my $ex = <EX>) {
    chomp($ex);
    my $urls = $Urls . $ex;

    print "Checking $urls\n";
    if (!head($urls)) {
        push(@badUrls, $urls);
    } else {
        push(@goodUrls, $urls);
    }
}
close(EX);
print "try these: @goodUrls.\n";
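The URL-building part of that loop can be tried on its own without any network calls, which makes it easier to see that the concatenation works. The base URL and path list here are made up, standing in for the typed address and the exploits.txt lines:

```perl
use strict;
use warnings;

my $base  = "http://www.example.com";          # placeholder base URL
my @paths = ("/index.html", "/members.html");  # stand-in for exploits.txt entries

# Join the base with each path, the same concatenation the loop performs
my @urls = map { $base . $_ } @paths;
print "$_\n" for @urls;
```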
 