I've gotten pretty close to caught up on all the priority list items here, so I'm starting to think about preventative maintenance.
I am doing weekly full + daily differential backups to media removed from the building. I know a backup isn't worth the processing time it took to make if the data can't be restored, and I've had success restoring a file or folder here and there as necessary. But I have yet to have a failure or other incident where I needed to restore a large amount of information.
We are a small business, but we've still managed to generate enough data that it's unreasonable to think eyeballing it would be good enough to say ALL the data was recovered.
I want to do a large-scale test restore on our file server sometime soon, but I'd like to come up with a good strategy for proving that I was able to recover everything.
If you were doing a test run to prove out your backup system, how would you verify that you recovered ALL your data?
Is a byte-for-byte size comparison or a file count reasonable? Could a script be written to traverse a drive and record all the folder and file names? Run it pre- and post-recovery, then compare the two lists? Am I completely off base here?
Any insight is appreciated.