Data Classification Enabler


hutchingsp
Sep 27, 2008
I installed this today. Our main file server holds around 9 million files and backs up fine, but it takes two hours to do the scan on a full backup.

Anyway, in a nutshell, is there any more to it than installing it and then changing the iDataAgent to use the DCE as the scan type?

I'm not planning on changing it until Friday when it runs a full backup (I wasn't sure whether switching the scan type would force a full backup on the next incremental).

For those of you using the DCE, what difference have you found it makes to your scan times?
 
Once you install the DCE you need to update it; this is a separate update from the one for the iDA. If you haven't pulled the DCE update into the update cache, you can run the update from the client and then choose DCE from the update menu.

There is also a DCAdmin program in the CommVault program group. With this tool you can select volumes to filter, view graphs and a few other bits. With 9 million files I imagine your DCE database is quite a size too (probably over 1 GB), which you need to account for. DCE works by reading the USN change journal in real time; there are some CPU and disk space requirements, but your scan phase will be improved significantly.

Obviously you need to pick DCE as the scan engine at the backup set level, and this will not force a full backup; it's not like switching storage policies. Books Online and the Knowledgebase on Maintenance Advantage provide some good info on DCE.
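If you want to sanity-check that the volumes actually have a healthy change journal for DCE to read from, a rough sketch along these lines works (not a CommVault tool, and the volume letters are just assumptions - run it elevated on the file server):

```python
# Rough sanity check, not a CommVault utility: query the NTFS change journal
# that DCE reads from. Run elevated on the file server.
import subprocess

volumes = ["C:", "D:"]  # assumed - replace with the volumes in your backup set

for vol in volumes:
    # "fsutil usn queryjournal" is a standard Windows command; it errors out if
    # the volume has no change journal, which would leave DCE nothing to read.
    result = subprocess.run(["fsutil", "usn", "queryjournal", vol],
                            capture_output=True, text=True)
    print(f"--- {vol} ---")
    print(result.stdout or result.stderr)
```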

---------------------------------------
EMCTA & EMCIE - Backup & Recovery
Legato & Commvault Certified Specialist
MCSE
 
Thanks for the reply - I just looked on Maintenance Advantage and there are a few DCE updates; I'm not sure which I need, though, or if it's just 0758 and 0951?
 
Oh, and as you suggested, the database files on the various LUNs are pretty big - does this account for the seeming lack of responsiveness in the admin utility?

It seems to spend most of its life with an hourglass - maybe this will change after the re-scan once those two updates are applied.
 
Yes... 9 million items means the database will be quite a size. The size is obviously proportional to the number of items. Once the database build is complete, CPU and memory are consumed in proportion to the rate of change.
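For a rough back-of-envelope on the database size (the bytes-per-item figure below is only an assumption, pulled from the "probably over 1 GB for 9 million items" guess above):

```python
# Back-of-envelope only: estimate of the DCE database size. The bytes-per-item
# figure is an assumption inferred from "9 million items -> probably over 1GB".
BYTES_PER_ITEM = 120          # assumed average footprint per file/folder record
items = 9_000_000

estimate_gib = items * BYTES_PER_ITEM / 1024**3
print(f"~{estimate_gib:.1f} GiB")   # roughly 1.0 GiB for 9 million items
```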

I would not bother with updates 0758/0951; you need to update the entire CommCell to SP5: CommServe first, then the MediaAgents, then the clients. Don't forget to run the update for the DCE as well.

regards


---------------------------------------
EMCTA & EMCIE - Backup & Recovery
Legato & Commvault Certified Specialist
MCSE
 
Thanks - I'm already on SP5, but I didn't have the DCE installed previously. I tend to let the CommCell/Automatic Updates handle updates; it's not often I download and run a Service Pack, and I wasn't aware it gave the option to install a distinct SP for the DCE - which I've just done.

I may watch when the backup kicks off as I'm curious, but if the DCE maintains an "on the fly" database of files/folders on disk, what is there for it to do when a backup kicks off?

Presumably it has a database at that point in time, so why is there still a (from the sound of it, fairly lengthy) scan process?
 
During the scan process a collect file has to be created which lists what data will be backed up. You should see that the DCE scan is significantly shorter than a Change Journal or Classic scan, but 9 million files is still 9 million files.

You can check the scan log on the client to see which scan method is being used and then check how big the collect file gets (which is in the JobResults folder).
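If you want to keep an eye on how big the collect files get without hunting around manually, a rough sketch like this works (the JobResults path below is an assumption - point it at the Job Results folder on the client):

```python
# Quick look at how big the collect files get after a scan. The path below is
# an assumption - set it to the client's Job Results folder.
import os

job_results = r"C:\Path\To\JobResults"   # assumed - adjust for your client

for root, _dirs, files in os.walk(job_results):
    for name in files:
        path = os.path.join(root, name)
        size_mb = os.path.getsize(path) / 1024**2
        if size_mb > 10:                 # only list the larger files
            print(f"{size_mb:8.1f} MB  {path}")
```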

You can make huge savings in the backup phase by using SynthFull rather than a regular Full. This way you only ever run incremental backups of the client once the initial full is done.

Hope that helps.
BTW, DCE is also great for DataArchiver and can now (in version 8.0) be used for Exchange Mailbox backups. In all cases DCE reduces the scan phase enormously, but it doesn't get it down to seconds.

regards


---------------------------------------
EMCTA & EMCIE - Backup & Recovery
Legato & Commvault Certified Specialist
MCSE
 
There's 1.2 GB from the last full backup in the JobResults folder from the looks of it. I was using Change Journal but have changed it to DCE for the full tonight and from now on.

The SynthFull is "interesting" in that I'd totally misunderstood how it worked. I'd assumed I take a full today (for example) that is, say, 20 tapes, and next week and subsquent weeks my "full" actually just writes X tapes (say 1-2) that represents the change delta.

It seems it actually copies the original full and incrementals tape to tape?

If that's the case, I suspect we're actually quicker doing a regular full, as the media agent is the file server and we average 350 GB/hr on the full backup.

Really appreciate the input on the DCE, by the way - I've found CommVault Support to be pretty good when you have a problem, but when you just want clarification on something they have a tendency to point you at the manual.
 
SynthFull is really only viable if your incrementals are going to disk. You could have an incremental storage policy whereby the incrementals go to disk and the SynthFull goes to tape, providing your last full/SynthFull is online at the right time.

A SynthFull has to read back all the incrementals since the last full/SynthFull, plus the last full/SynthFull itself, in order to synthetically construct a new full. That is why this works fine when reading back from disk; having to load up various tapes and seek for the bits it needs will slow things down.


Since you are going direct to tape, Incr/Full is correct. 350 GB/h is excellent, so there is no need to change what you're doing here.
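Just to make the read-back point concrete, here is a purely conceptual sketch (nothing to do with CommVault internals) of how a synthetic full is assembled: the last full plus every incremental since it, with the newest version of each path winning.

```python
# Conceptual illustration only - not CommVault code. A synthetic full is built
# by reading back the last full plus every incremental since it, keeping the
# newest version of each path; that read-back is why disk is fine and tape hurts.
last_full = {"a.txt": "v1", "b.txt": "v1", "c.txt": "v1"}
incrementals = [
    {"b.txt": "v2"},                # night 1: only b.txt changed
    {"c.txt": "v2", "d.txt": "v1"}, # night 2: c.txt changed, d.txt is new
]

synthetic_full = dict(last_full)
for incr in incrementals:
    synthetic_full.update(incr)     # newer versions win

print(synthetic_full)
# {'a.txt': 'v1', 'b.txt': 'v2', 'c.txt': 'v2', 'd.txt': 'v1'}
```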

---------------------------------------
EMCTA & EMCIE - Backup & Recovery
Legato & Commvault Certified Specialist
MCSE
 
Hmm, OK - the scan doesn't seem to be taking any less time than I'm used to.

Process Explorer/Task Manager show that the ifind.exe process is running and using a fair bit of CPU, and ditto some of the gxdc.exe processes.

Is there any way to confirm the scan is using DCE?

I'm wondering whether the DCE databases had finished rebuilding after the service pack, and, if they were still rebuilding when the backup kicked off, whether Galaxy waits or just defaults to a Change Journal scan?
 
The job as it is displayed in the Job Controller should indicate the scan method. I had already suggested reviewing the scan log, but you can also pick the job and choose View Logs if you prefer it that way.

I would have thought a DCE database for a volume with 9 million files would take at least 8 hours to rebuild, so I would hazard a guess that it has defaulted back to Classic.

---------------------------------------
EMCTA & EMCIE - Backup & Recovery
Legato & Commvault Certified Specialist
MCSE
 
It was Change Journal. From the DCE logs it looks as if one volume took hours (about 8, as you say) before the logs showed the database as initialized.

Fingers crossed for the next incremental!
 
Hi,

How did your incremental go? We currently have a call logged with CommVault for exactly the same issue - it was recommended to us to use DCE for our file servers since the incremental scan can take up to 3 hours. After setting everything up as per the documentation and allowing the databases to initialise properly, we are still not seeing reduced scan times.

The FileScan.log is reporting

FindWorkThreadWrapper::Run() - Not all paths could be scanned thru DC. Falling back to the other scan methods

and then the Change Journal scan is used.
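If anyone else wants a quick way to check their clients for the same message, a rough sketch like this works (the log path is an assumption - point it at the CommVault log directory on the client):

```python
# Rough check for the fallback message in the client's FileScan.log. The log
# path is an assumption - adjust it for your client.
from pathlib import Path

log_file = Path(r"C:\Path\To\Log Files\FileScan.log")   # assumed location

for line in log_file.read_text(errors="ignore").splitlines():
    # "Falling back" is the message quoted above - if it appears, DCE could not
    # cover every path and the job reverted to another scan method.
    if "Falling back" in line:
        print(line)
```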

As yet, CommVault support have not been able to find out why this is happening. Apparently "It should just work!" :)

Will keep you informed if we discover anything. Good luck!
 
It's with CommVault - let's just say the DCE seems pretty poor so far; the documentation sucks, and it just doesn't appear stable whatsoever.
 
Unfortunately, I would have to agree with you. The documentation is near non-existent, and even CommVault support do not seem to know much about it.
 
Hi Calippo

Thanks for your reply. Yes, we are well versed in Books Online, which is usually a great source of information, as you point out. However, in this situation (specifically, using the DCE for Windows iDataAgent backups and troubleshooting this setup) it has been a bit short on information - other than the method for changing the backup set to use DCE.

I have another remote session with CommVault support scheduled soon. Hope to learn more after that...
 
The documentation is there, but it's no use when the thing doesn't work, or when you need a log deciphering.

I noticed the DCE documentation in 8.0 has been updated, which at least makes me hope the 8.0 DCE is new and, hopefully, improved.
 