
Defragmentation on Windows NT

Status
Not open for further replies.

storer

Programmer
Mar 27, 2000
41
US
I have Windows NT at work. We have Windows 98 at home, and my husband is forever running the defrag. He says I should do this at work, but I can't on Windows NT. He's concerned that I'm going to crash and lose everything. Our company's technical support doesn't think it's necessary, but I sometimes wonder about their "expertise." Do I need a defrag program for Windows NT? Is it good to run defrag on a regular basis? Our son, who I thought was rather knowledgeable, says it's not real good to defrag all the time.
Please help me end this debate! Thanks!
 
There are several third-party applications that defrag Windows NT. I personally have run NT without defragging for years, but I have no doubt that if I were to take the time, it would speed things up.

As for "losing" everything: there is really not much more chance of losing data on a fragmented hard drive than on a non-fragmented one. Hard drive manufacturers rate the "unrecoverable read error" rate by the number of bits read, which does not change just because of fragmenting. Having said that, BACK UP ANYWAY. Regardless of the pros and cons of defragmenting, it is always a good idea to have a backup.

As for the defragmenting procedure itself, it reads and writes virtually every file on the hard drive. It stands to reason that if there is any problem, chances are it will occur DURING the defragmenting process!

For my Win9x users, I recommend defragmenting no more than once every few months, depending on work type and amount. The average user does not need to defragment more than once a year (if even that). As for NT users, if they are running NTFS, I usually tell them not to worry about it. If data access off the hard drive slows down noticeably, then I take a look at it. 99% of the time it still does not need defragmenting!
David Moore
dm7941@sbc.com
 
Defragmenting, I think, was not as much of an issue a few years back. However, applications are many times the size they used to be.

The amount of fragmentation also depends on how your paging file is configured. It is good to set the file to a static size. You can determine the ideal size using Task Manager: press Ctrl+Alt+Del, go to the Performance tab, and look at the peak value under "Commit Charge." Set your swap file 75 MB bigger than this amount. If your current Commit Charge is ever bigger than that recorded peak, increase the size again.
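The sizing rule above is simple arithmetic; a minimal sketch (function name and the 310 MB peak are illustrative, not from the thread):

```python
# Sketch of the pagefile-sizing rule described above: set the static
# pagefile 75 MB larger than the peak Commit Charge seen in Task Manager.
# If Commit Charge ever exceeds the old peak, recompute with the new peak.

def pagefile_size_mb(peak_commit_mb, headroom_mb=75):
    """Recommended static pagefile size: peak commit charge plus headroom."""
    return peak_commit_mb + headroom_mb

# Example: Task Manager shows a peak Commit Charge of 310 MB.
print(pagefile_size_mb(310))  # -> 385
```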

Microsoft has added defrag to Windows 2000.
The company that provided the utility is Executive Software.
The product is Diskeeper; they have a free version on their web site. However, if you purchase the software, it has a "set it and forget it" feature.
I have had servers running for two years and never had to worry about defragging again. It is all done in the background at a scheduled time.
Hope this helps.


-john
 
John, your point is good: fixing the size of the swap file increases performance, because the system is not continually recalculating how big it should be. But my own preference for a fixed swap file size uses the following formula:

Total pagefile size = 2.5 x RAM. This should cover any peaks you would see using Performance Monitor, and also unexpected peaks that you might only see if you ran Perfmon for a month or two.

Split this over two partitions (where practical), so that the size of the pagefile on partition 1 = RAM + 12 MB. This allows the system to save crash dumps and registry files.

Put the remainder on the second partition. This has the added advantage of saving some space on the system partition.
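The split above works out as follows; a small sketch with illustrative numbers (the 256 MB server is my example, not from the thread):

```python
# Sketch of the 2.5 x RAM pagefile split described above.
# Partition 1 gets RAM + 12 MB (enough for the system to write a crash
# dump); the remainder of the 2.5 x RAM total goes on partition 2.

def split_pagefile(ram_mb):
    total = 2.5 * ram_mb
    part1 = ram_mb + 12       # system partition: RAM + 12 MB
    part2 = total - part1     # remainder on the second partition
    return part1, part2

# Example: a server with 256 MB of RAM.
p1, p2 = split_pagefile(256)
print(p1, p2)  # 268 MB on partition 1, 372 MB on partition 2
```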

Also, allow the registry to grow to 32 MB. It will probably never reach this size unless you are running Terminal Services, but think of it as insurance.

Defragmentation is not a major issue on NT servers, in my experience.

I tend to use this two-partition model to install %systemroot% to partition 1 and applications to partition 2. The average disk size in server-class machines is 9 GB, so there is plenty of scope here.

I hope this information is useful
 
One point that's being missed: a common best practice in a network environment is to require that all work files be kept on a server drive. With the local hard disk used only for applications and the OS, fragmentation is not really an issue.
Jeff
masterracker@hotmail.com

If everything seems to be going well: you don't have enough information.......
 
Thanks for everyone's input! I'll share this info with my significant other!
 