
Fetching images from a URL

Status: Not open for further replies.

plotzer (Programmer)
Aug 28, 2001
There are some .tif images on an HTTP server on our intranet that I would like to automatically copy to my server at 15-minute intervals. I was wondering what the best way to do this in Perl would be. I've looked at a number of modules that grab data from websites, but I keep thinking there must be an easier way. I simply want to get a listing of the images in a certain directory, such as:


and then copy the latest images from this directory to my server.

I was looking at the File::Copy module, but I can't seem to get it to copy the image. What do you think would be the best way?

Thanks
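
One likely reason the File::Copy attempt failed: File::Copy only copies between filesystem paths, so it can't pull a file straight off an http:// URL. A minimal sketch of the local-copy case, assuming the images are also reachable as a mounted or UNC share (the paths here are made up):

Code:
use strict;
use warnings;
use File::Copy qw(copy);

# File::Copy works on filesystem paths only -- a mounted or UNC
# share, not an http:// URL.
my $src  = '//imageserver/images/latest.tif';   # hypothetical share path
my $dest = '/home/images/latest.tif';           # hypothetical destination

copy($src, $dest) or die "Copy failed: $!";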
 
LWP; the docs are on CPAN.
If you can get a directory listing, wget might be the biscuit.

If you have local access, File::Copy should cut it. What problems did you run into with that?

--Paul
------------------------------------
Spend an hour a week on CPAN, helps cure all known programming ailments ;-)
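
For reference, a rough sketch of the LWP route Paul mentions: fetch the directory index, pick the .tif links out of the HTML, and mirror any that have changed. The URL, local directory, and regex are assumptions (the regex expects a stock Apache-style auto-generated index page):

Code:
use strict;
use warnings;
use LWP::Simple qw(get mirror);
use HTTP::Status qw(is_success RC_NOT_MODIFIED);

my $base  = 'http://intranet.example.com/images/';   # hypothetical URL
my $local = '/home/images';                          # hypothetical directory

my $listing = get($base)
    or die "Couldn't fetch directory listing from $base\n";

# Pull the .tif filenames out of the index page (crude, but good
# enough for an auto-generated directory listing).
my @tifs = $listing =~ /href="([^"\/]+\.tif)"/gi;

for my $file (@tifs) {
    # mirror() sends If-Modified-Since, so only new or updated
    # images are actually downloaded on each run.
    my $rc = mirror("$base$file", "$local/$file");
    if    (is_success($rc))        { print "fetched $file\n"; }
    elsif ($rc == RC_NOT_MODIFIED) { print "$file unchanged\n"; }
    else                           { warn "failed $file ($rc)\n"; }
}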
 
Paul,

Totally forgot about using wget. That is exactly what I needed. Thanks
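
For anyone finding this thread later, the wget route comes down to one command plus a cron entry; the URL and target directory below are placeholders. The -N (timestamping) option makes wget skip files that haven't changed since the last run, so only the latest images come across:

Code:
# -r -np -nd: recurse this directory only, don't climb to the parent,
#             and drop files straight into the target dir
# -N        : only fetch files newer than the local copies
# -A tif    : accept only .tif files
wget -q -r -np -nd -N -A tif -P /home/images http://intranet.example.com/images/

# crontab entry to run it every 15 minutes:
*/15 * * * * wget -q -r -np -nd -N -A tif -P /home/images http://intranet.example.com/images/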
 
