Poor backup performance for large numbers of files


ManfredSchilling (Technical User, DE)
Sep 4, 2002
We back up a NetApp F810 NAS filer via a Win2K SP2 2.4 GHz / 512 MB system to a NetWorker 6.1.2 server on a Solaris E250. All systems are Gigabit-connected, and the backup data are written to a file type device. We have already tested the W2K client versions 6.1.1, 6.1.2 and 6.2, but backup speed seems to be limited to approx. 100,000 files/hour:
win10: NAS_02 level=full, 49 GB 07:53:52 814509 files
Incremental backup run:
win10: NAS_02 level=incr, 4186 MB 01:37:15 100618
Turning off compression had no effect, no special directives are used, and virus scans are disabled.
No bottleneck in CPU, memory, disk or network I/O shows up on the NAS, the W2K client or the NetWorker server during the backup (the systems seem to be idle most of the time). What is the limiting factor, and what can we do to speed up performance?
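For reference, the arithmetic behind the ~100,000 files/hour figure, as a quick Python sketch using the full-backup line above:

# Rough files/hour rate from the full backup
# (814,509 files in 7h 53m 52s).
def files_per_hour(files, h, m, s):
    return files / (h + m / 60 + s / 3600)

print(round(files_per_hour(814509, 7, 53, 52)))  # ~103,000 files/hour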
 


Have you tried NDMP and SnapImage to back up the filer directly to the Solaris backup server? You would see better performance.

One problem you are running into here is that the Windows system has to read the data from the filer and then pass it over to the backup server, so your network takes a double hit.

Also, backing up a filer this way is no longer supported by Legato, since NDMP has been integrated into NetWorker. You can also attach a tape drive to the filer and back it up directly; this would just require an NDMP connection (not SnapImage).

If you are using a jukebox, it can be shared between the filer and the server; jukebox sharing is a feature of the Network and Power Editions.
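As a rough illustration of that double hit (a hypothetical Python sketch, using the 49 GB full backup from the original post and an optimistic ~112 MB/s GbE wire rate):

# UNC/CIFS path: filer -> W2K client -> NetWorker server,
# so every byte of the 49 GB full save crosses the LAN twice.
FULL_GB = 49
GBE_MB_PER_S = 112                     # ~1 Gbit/s, best case

lan_gb = FULL_GB * 2                   # ~98 GB over the wire per full backup
wire_minutes = lan_gb * 1024 / GBE_MB_PER_S / 60

print(f"LAN traffic per full backup: ~{lan_gb} GB")
print(f"raw transfer time at wire speed: ~{wire_minutes:.0f} min")
# With NDMP to a filer-attached drive the data never touches the LAN;
# only the file history goes to the NetWorker server.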
 
You're not alone in this discovery... here are my stats on UNC-based vs. NDMP-based backups:

NAS Configuration
- EMC Celerra with 2 datamovers
- SCSI attachment to EMC-3430
- 10 "Volumes"
- 500 GB
- 3,400,000 files

UNC-based backup takes over 30 hours (about 105,000 files/hour)

NDMP-based backup runs for approximately 14 hours
- STK-9714 directly attached to the Celerra with 4 DLT-7000s (2 per datamover)
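The rough rates those figures work out to (a quick Python sketch; the UNC run is "over 30 hours", and roughly 32-33 hours matches the quoted ~105,000 files/hour):

# Files/hour for the same 3,400,000-file Celerra backup over both paths.
FILES = 3_400_000
print(round(FILES / 32.5))   # UNC path,  ~32.5 hours: ~105,000 files/hour
print(round(FILES / 14))     # NDMP path, ~14 hours:   ~243,000 files/hour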
 
Hi,

I agree with "john7": it is Windows that is the bottleneck, not Legato itself. I have heard that NetWorker 7 is supposed to do something different in the way it handles huge filesystems, but I don't know what.
I have seen the same problem when backing up a SAN. We invested in expensive DDS licenses and FC-attached drives to get really fast backups, but when we later tried to back up these clients over the normal 100 Mb/s network, we actually got better performance that way. These servers had over 1,000,000 files each.

Does anyone have experience with SnapImage?

/Staffan
 
Thanks to all who replied!
It seems that the Windows 100,000 files/hour limit can't be beaten, whatever configuration you're using.
Does anybody have experience with NDMP/SnapImage-based backups? We see the following disadvantages: at least one tape drive is reserved for NDMP (not usable for normal backups), backup to disk is not supported, and you can't clone the backup. And what about restoring single files from such a backup? (Our NW version is 6.1.2.)
 