
Need recursive ftp tips


TimeTraveler · IS-IT--Management · Nov 20, 2001 · 99 · US
I'd like to recursively ftp my website from the server down to a single .zip or .tar file locally. I don't have the server space to tar the site first. The online version of my site has drifted from the local version, and I want to archive the whole site in one swoop, including images, slurped JavaScript files (*-js.txt) and whatnot.

Ideas?

Sean aka TimeTraveler

Thanks! I checked out the page. It looks like the right tool, but I'm not getting the results I want. The output I'm getting is a single file with a bunch of ftp links containing my id/pw combo, followed by the URL to each file.

wget ftp://id:pw@ftp.domain.ext/dir/

yields this:

--10:38:10-- ftp://id:pw@ftp.domain.com:21/dir/
=> `.listing'
Connecting to ftp.domain.com:21... connected!
Logging in as id ... Logged in!
==> TYPE I ... done. ==> CWD dir ... done.
==> PORT ... done. ==> LIST done.

0K ->

10:38:11 (231.45 KB/s) - `.listing' saved [237]

Removed `.listing'.
Wrote HTML-ized index to `index.html' [820].

Where the file contains:

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<title>Index of /dir on ftp.id.com:21</title>
</head>
<body>
<h1>Index of /dir on ftp.id.com:21</h1>
<hr>
<pre>
2003 Jul 30 07:46 File <a href="ftp://id:pw@ftp.id.com:21/dir/home.htm">home.htm</a> (3,549 bytes)
2003 Jul 30 07:47 File <a href="ftp://id:pw@ftp.id.com:21/dir/home.html">home.html</a> (3,549 bytes)
2003 Jul 30 07:47 File <a href="ftp://id:pw@ftp.id.com:21/dir/index.htm">index.htm</a> (3,543 bytes)
2003 Jul 30 07:47 File <a href="ftp://id:pw@ftp.id.com:21/dir/index.html">index.html</a> (3,549 bytes)
</pre>
</body>
</html>

(I search-and-replaced the real id and pw for privacy, obviously.)

Sean aka TimeTraveler

Hmm, I would have thought that you would need to recurse (basically spider) your website back to your local machine with -r. I'm not sure of the exact syntax, but it would be something like: wget -r -l6
Where -r recurses through the site directories and -l6 (ell + six) limits the recursion to 6 levels only. You'll probably need to confirm that syntax (don't trust my memory).
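Spelled out in full (again from memory, so confirm against the man page), the fetch would look something like this. The host, id, pw and directory are placeholders for your own details:

```shell
# Recurse the FTP tree up to 6 levels deep, keeping the directory
# layout locally; -np stops wget from climbing above the starting
# directory. id, pw and ftp.domain.ext are placeholders.
wget -r -l6 -np "ftp://id:pw@ftp.domain.ext/dir/"
```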


Laurie.
 
Thanks Laurie! The -r was what I was missing, and it worked with the ftp. The documentation on the -r option listed with --help and online made it seem oriented toward link following / spidering, not directory recursion.

Thanks again. Star given.

Sean aka Time Traveler



Um, I've been working on this for a couple of hours now, and wget is definitely a finicky utility. The idea that someone can simply download their site with it is not true.

wget bombs and doesn't grab what I want it to grab. I've tried all the various settings and it still bombs. Using the -I or -X switches basically obliterates the -r option, and really, if I had time to go through and list all the subdirs I wanted, I could have ftp'd them manually myself.

I definitely would not recommend wget to anyone. Any other ideas/utilities? Even Windows based?

Sean






Hmmm .. that's a shame. I've had good results with wget, but then again only for grabbing a site with probably no more than two or three layers.

On the commercial side I use TeleportPro, but it has a price tag of $40 to license and will only pull a maximum of 500 files in shareware mode (if unlicensed).

But it's easy to use, Win32 GUI based and "in my opinion" gets the job done (but I said the same about wget!) .... :¬)


Laurie.
 
Thanks for the help. I actually found out about a command called scp (not the GNU version) and it worked on the second try. (The documentation wasn't clear that a local directory had to be specified.)

scp -r user@domain:remotedirectory localdirectory
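And since the point of all this was a single archive, a follow-up tar on the local copy finishes the job. Same placeholder names as the scp line; -czf writes a gzipped tarball:

```shell
# Pull the whole remote tree down, then pack the local copy into one
# gzipped archive; user@domain and both directory names are
# placeholders for your own details.
scp -r user@domain:remotedirectory localdirectory
tar czf site-backup.tar.gz localdirectory
```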

Sean


Well yes! Why didn't I think of that? I use it every day without thinking. I should have known about the recurse flag -r in scp .. Doh!

You are aware of pscp (comes with PuTTY) if you need to do the same from a Win32 client?
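From memory again (so check pscp -h for the exact flags), the pscp version of your scp line would be the same shape; user@domain and the directory names are placeholders:

```shell
# Same recursive copy, run from a Windows box with PuTTY installed;
# user@domain and the directory names are placeholders.
pscp -r user@domain:remotedirectory localdirectory
```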

I love *nix: 101 ways to do the same thing :¬)

Laurie.
 
