
Windows Server 2008 tweak 1


apc1234

Technical User
Jun 19, 2009
Hi,

In Windows Server 2008, is it possible to clear the cached memory without a reboot?

Path is:
Task Manager --> Performance --> Physical Memory --> Cache.

When I run SQL Server 2008 Analysis Services, it loads a large amount of data into memory, and even after the application is closed the cached memory does not seem to clear out.

Any ideas?

Thank you, in advance, for all suggestions.
 
That's because the data used by SQL Analysis Services is cached in memory.

You can get some more detail here:


Basically, it is normal for data used by recently executed programs to remain in memory in a "Cached" status. If that program is launched again, the cached memory will enable faster startup and improved performance of that application. If that memory is needed by another application, it should be cycled out as normal.

Is it causing some sort of problem?

If the memory is not in use then those pages will be marked available and re-used when the next application needs memory.

This behavior is perfectly normal, and I wouldn't worry about it unless it is causing issues.
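
If you want to verify this rather than trusting Task Manager's single "Cached" number, here is a minimal sketch (assuming Python is installed on the server; it is Windows-only because it calls the Win32 GlobalMemoryStatusEx API through ctypes). The point it illustrates is that available physical memory stays high even while a lot of memory shows as cached, because cached pages can be handed to the next application that asks for them.

```python
# Minimal sketch: query physical memory via the Win32 GlobalMemoryStatusEx API.
import ctypes
from ctypes import wintypes

class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

def physical_memory_mb():
    """Return (total, available) physical memory in MB."""
    status = MEMORYSTATUSEX()
    status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))
    return status.ullTotalPhys // 2**20, status.ullAvailPhys // 2**20

if __name__ == "__main__":
    total, avail = physical_memory_mb()
    print("Total physical: %d MB, available: %d MB" % (total, avail))
```

Run it before and after closing Analysis Services; the "available" figure is what matters for new workloads, not the cached figure.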

________________________________________
CompTIA A+, Network+, Server+, Security+
MCTS:Hyper-V
MCTS:System Center Virtual Machine Manager
MCSE:Security 2003
MCITP:Enterprise Administrator
 
Hi,

The issue we are noticing is that the cached amount is not clearing. Once we load Analysis Services and close it, the cached amount does not change, even when we load another instance of Analysis Services or any other application.
Any suggestions on clearing the cached memory?

Any help is appreciated.

Thanks.
 
It doesn't need to clear; it is a cache. It contains useful information that will be used on subsequent executions of SQL Analysis Services. If another application needs that memory before you run SQL Analysis Services again, then the amount of memory needed will be released for that application. It is functioning as intended.

Let's look at it this way: a CPU has many places where it can store data. The fastest location is in the CPU's registers. Unfortunately, there are very few registers, so there is also a Level 1 CPU cache. This is an extremely fast bit of memory built into the CPU to store data and instructions. Because it is part of the CPU and is very fast, it is also fairly small (to keep costs down). Beyond the Level 1 cache, modern CPUs often also have Level 2 and Level 3 (L2 and L3) caches. These higher-level caches are progressively larger than the lower-level ones, but slower to access. They are still extremely fast compared to system memory, though.

The next level of storage down from there is system memory. This can be a MASSIVE data store, but it is much slower than any level of CPU cache. However, it is much, much faster than hard disk storage. Hard disks are, of course, many times larger than the system memory of a computer, but they are also many times slower.

So basically you have a data storage hierarchy that looks like this:

Hard Disk
RAM
L3 cache
L2 cache
L1 cache
CPU Register

All layers store data. As you move down the stack you get to progressively faster but smaller storage areas. A well-optimized system will make use of the faster storage first, so the most frequently used data and instructions will reside lower in the stack. If you consider that the largest data set available is on the hard disk, a well-optimized computer will keep as much of that data set in RAM, cache, or registers as possible. The most recently and frequently used data will always be retained in the faster storage areas.
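
Purely as an illustration of that hierarchy, here is a small sketch. The sizes and access times are assumed, rough orders of magnitude, not measurements from any particular machine:

```python
# Illustrative only: rough, assumed orders of magnitude for the storage
# hierarchy described above. Real figures vary widely by hardware.
hierarchy = [
    # tier            typical size        typical access time
    ("CPU register",  "a few hundred B",  "well under 1 ns"),
    ("L1 cache",      "tens of KB",       "~1 ns"),
    ("L2 cache",      "hundreds of KB",   "a few ns"),
    ("L3 cache",      "several MB",       "~10 ns"),
    ("RAM",           "several GB",       "~100 ns"),
    ("Hard disk",     "hundreds of GB",   "milliseconds"),
]

for tier, size, latency in hierarchy:
    print("%-14s %-18s %s" % (tier, size, latency))
```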

When an application (say, SQL Analysis Services) runs, the needed data is loaded from disk into RAM, and some bits get loaded lower in the memory stack if they are used more frequently. When the application has finished running and has terminated, the data it used will remain CACHED in RAM in case it is needed later. If another application needs memory and there isn't enough available, parts of the old data cached in memory will be flushed and replaced with the new data that is needed.

THIS IS NORMAL AND AS DESIGNED.
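
Here is a toy model of that behavior. Everything in it is an assumption for illustration only: Windows' real memory manager uses working sets, standby lists, and page priorities rather than a plain least-recently-used list, but the basic idea of "keep it cached until someone else needs the space" is the same.

```python
# Toy model: pages stay cached after their owner exits and are only evicted
# when a later allocation actually needs the space.
from collections import OrderedDict

class ToyPageCache:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.pages = OrderedDict()  # page id -> owning app, in LRU order

    def load(self, app, page_ids):
        for page in page_ids:
            if page in self.pages:
                # Cache hit: the page is already in RAM, so "startup" is fast.
                self.pages.move_to_end(page)
                continue
            if len(self.pages) >= self.capacity:
                # Only now, under memory pressure, is old cached data flushed.
                self.pages.popitem(last=False)
            self.pages[page] = app  # cache miss: read from disk, then keep it

cache = ToyPageCache(capacity_pages=4)
cache.load("SSAS", ["a", "b", "c"])    # Analysis Services runs, then exits...
cache.load("OtherApp", ["x", "y"])     # ...its pages stay until space is needed
print(list(cache.pages))               # ['b', 'c', 'x', 'y'] - oldest page evicted
```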

The problem that I suspect you are running into is one of perception, not a technical issue. For many years it has been common practice to be concerned when memory utilization was high. For some reason people thought that if they didn't see lots of free memory then their computer didn't have enough RAM. But the reality is, if your computer has lots of free memory then you have way too much memory installed. It is far more efficient to have a server with 4GB of RAM running at 95% memory utilization than it is to have a server with 8GB of RAM running at less than 50% memory utilization, right?
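
The arithmetic behind that comparison, with illustrative numbers only:

```python
# Rough arithmetic for the 4GB-at-95% vs. 8GB-at-50% comparison above.
MB_PER_GB = 1024

servers = [
    ("4 GB at 95% utilization", 4 * MB_PER_GB, 0.95),
    ("8 GB at 50% utilization", 8 * MB_PER_GB, 0.50),
]

for name, installed_mb, utilization in servers:
    idle_mb = installed_mb * (1 - utilization)
    print("%s: about %.0f MB of RAM sitting idle" % (name, idle_mb))
# ~205 MB idle vs. ~4096 MB idle: the second server has roughly 4 GB of
# memory that was paid for but is doing no work at all.
```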

So with the newer versions of Windows, Microsoft has rewritten the way that memory management works. With Vista they implemented Superfetch, which basically tries to predict which bits of data you will need and pre-emptively loads them into main memory from disk. When Superfetch predicts correctly it can dramatically improve performance. Server 2008 does not use Superfetch, but it does something similar. Being a server OS, and therefore more sensitive to workload, it does not try to predict what data will be needed and load it in advance. Instead it keeps previously used data cached in memory in case it is needed again, and flushes the cache if more memory space is needed.
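
If you want to watch that happen rather than infer it from Task Manager, one option is to sample the standard Memory performance counters between runs with the built-in typeperf.exe tool. The sketch below is just one way to script that (and note that "\Memory\Cache Bytes" is the system file cache working set, which is only part of the figure Task Manager labels "Cached"):

```python
# Sketch: sample Memory counters with typeperf.exe to watch the cache and
# available memory move between test runs instead of rebooting to "reset" them.
import subprocess

COUNTERS = [
    r"\Memory\Available MBytes",
    r"\Memory\Cache Bytes",
]

def sample_counters(samples=5, interval_seconds=2):
    """Collect a few samples and print typeperf's CSV output as-is."""
    cmd = ["typeperf"] + COUNTERS + ["-si", str(interval_seconds), "-sc", str(samples)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)

if __name__ == "__main__":
    sample_counters()
```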

As I asked before, is there a specific problem that having data cached in memory is causing, or are you just concerned by seeing the numbers?

________________________________________
CompTIA A+, Network+, Server+, Security+
MCTS:Hyper-V
MCTS:System Center Virtual Machine Manager
MCSE:Security 2003
MCITP:Enterprise Administrator
 

The problem is that we are iteratively testing a problem, and we have to reboot the server every time we need to load new data or a new application into memory. So, is there a way to flush the cache without having to reboot the server?

Thanks.
 
I don't think so, but if you're iteratively testing a problem, I would question whether you really need reboots between runs to begin with.

I had a similar issue in the past where they wanted a fresh run during performance testing of a particular app. They found that the first run was always significantly slower and wanted to reboot between runs. But I pointed out that since the server would only be rebooted once a month (and then only if there were security updates to be installed that required a reboot), the cached runs were more representative of real-world use.

________________________________________
CompTIA A+, Network+, Server+, Security+
MCTS:Hyper-V
MCTS:System Center Virtual Machine Manager
MCSE:Security 2003
MCITP:Enterprise Administrator
 
Sorry, I forgot to post the fix for this issue: it was Server 2008 SP2.

Thank you for all your posts and suggestions.

kmcferrin - special thanks to you for your explanation; I used your entry here to explain caching to my manager ;-).


 
