
backup large files (more than 12 GB) to a remote host (disk)?


kat115 (Programmer)
Feb 6, 2004
Hi,
I have several large files to back up to a filesystem on a remote host (I don't have a tape drive on the local host).
backup doesn't work since it wants a tape or floppy. How can I do this?
My files are larger than 10 GB (tar doesn't support that, I guess).
Thanks for your help,
kat
 
You could NFS-mount the remote host's filesystem on the local machine and do a simple cp -p. You might want to compress or gzip the files before you do the copy, however, because a 10 or 12 GB file is going to take a while to go over the network unless your network is very fast.
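A rough sketch of the NFS approach (the hostname "remotehost" and the /backup and /mnt/backup paths are placeholders, and the remote side must already export that filesystem):

  # on the local host, mount the remote filesystem
  mkdir -p /mnt/backup
  mount remotehost:/backup /mnt/backup    # some systems need: mount -t nfs ...
  # gzip the file on the way over to cut down on network traffic,
  # or just cp -p it as-is if the network is fast enough
  gzip -c /data/bigfile > /mnt/backup/bigfile.gz
  cp -p /data/otherbigfile /mnt/backup/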
 
Hi,
I thought about the NFS idea.
I don't know rsync but I'll take a look at it.
Someone told me about gtar (GNU tar, I guess); maybe it supports files larger than tar's 8 GB limit.
I have a gigabit ethernet network between the servers.
Thanks!
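For reference, a hedged sketch of what rsync and GNU tar could look like here (hostnames and paths are made up; rsync/ssh and gtar would have to be installed on both machines):

  # rsync over ssh: -a preserves modes/times, -z compresses in transit
  rsync -avz /data/bigfile1 /data/bigfile2 remotehost:/backup/
  # GNU tar's default format does not have the old 8 GB per-file limit,
  # so writing a gtar archive onto an NFS-mounted target is another option
  gtar -czf /mnt/backup/bigfiles.tar.gz /data/bigfile1 /data/bigfile2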
 
If you need to use tar or gzip, you may also need to pipe the data through a named pipe to get around 2 GB file limits...

or, in that case, maybe even netpipes.
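Roughly what the pipe idea might look like (a sketch only; hostnames and paths are placeholders, and it assumes ssh is available on both hosts):

  # stream the archive straight to the remote host over ssh, so no
  # single huge file ever has to sit on the local disk
  tar -cf - /data/bigfile | gzip -c | ssh remotehost 'cat > /backup/bigfile.tar.gz'

  # the same thing through an explicit named pipe (FIFO)
  mkfifo /tmp/backup.fifo
  ssh remotehost 'cat > /backup/bigfile.tar.gz' < /tmp/backup.fifo &
  tar -cf - /data/bigfile | gzip -c > /tmp/backup.fifo
  rm /tmp/backup.fifo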
 
Hi,
Chapter11, sorry, I don't understand what you mean. Could you please give an example?
Thanks,
kat
 