
How to efficiently synchronize large directories to 128GB memory sticks?

heintze (Programmer) · Nov 19, 2005 · 61 · US

I've noticed that many sites that offer backup services (like Dropbox) don't really do backup in the old-fashioned sense; they do file synchronization.

While I like this concept, synchronizing hundreds of gigabytes over the internet takes a long time (I tried one of those internet backup services, and it took almost a week of continuous network activity to synchronize my documents directory with the remote directory). It also requires keeping my notebook running all night, which makes me nervous because I don't believe notebook computers were designed to run continuously for days on end. Besides that, I frequently need to unplug from both power and internet to take the notebook with me.

So instead I bought three 128 GB memory sticks. I keep one in my pocket at all times, one at my residence, and one off site, and I rotate them. I've been experimenting with various programs such as robocopy, cygwin/rsync, xcopy, and unison to keep them up to date.
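
To make that concrete, here is roughly the kind of mirror-style command I have been experimenting with (the source path and the E: drive letter are just placeholders for my actual paths):

    robocopy C:\Users\me\Documents E:\Documents /MIR /R:1 /W:1
    rsync -av --delete /cygdrive/c/Users/me/Documents/ /cygdrive/e/Documents/

/MIR mirrors the whole tree (copies new and changed files and deletes files that no longer exist in the source), and the rsync line is the Cygwin equivalent.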

I think I have a problem, however. Old-fashioned backup programs used the archive bit: Windows sets the bit whenever a file is created or modified, and the backup program clears it after copying the file. An incremental backup then copies only files whose archive bit is set and skips the rest, because a cleared bit means that particular file has already been backed up and has not changed since.
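
If I understand correctly, xcopy and robocopy both expose that scheme directly (paths are placeholders again):

    rem Copy only files whose archive bit is set, then clear the bit
    xcopy C:\Users\me\Documents E:\Documents /S /M
    rem robocopy's /M switch does the same: copy flagged files, reset the bit
    robocopy C:\Users\me\Documents E:\Documents /E /M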

Well, I need three different archive bits (one for each memory stick), and Windows only supplies one per file. Hmmmph.

As a result, backup programs like robocopy, xcopy, and unison end up copying many gigabytes of data to my memory sticks for files that have not changed since they were last backed up to that particular stick.
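
To spell out why a single archive bit cannot serve three sticks: an archive-bit-based copy to the first stick clears each file's bit, so the same command aimed at the second stick skips files that stick has never received (E: and F: stand in for two of the sticks):

    rem Stick 1: copies the changed files and clears their archive bits
    xcopy C:\Users\me\Documents E:\Documents /S /M
    rem Stick 2: the bits are already clear, so these files are skipped
    rem even though F:\Documents never got them
    xcopy C:\Users\me\Documents F:\Documents /S /M

So the bit tells me a file changed since some backup, but not which stick it changed relative to.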

So how do commercial services like Google Drive, SkyDrive, Dropbox, and Carbonite determine that a file is out of date and needs to be copied to the remote backup? Do they use the archive bit? I suppose they could, since they only synchronize with one remote directory.

What switches do I use with robocopy, xcopy, or rsync so I don't have to spend hours copying files that are already up to date? Or perhaps there is some other program I should be using?


Thanks
siegfried
 