
Analysis Server Error: Memory error

Status
Not open for further replies.

johnem

MIS
Apr 6, 2003
18
AU
Dear Listers,

We're experiencing a memory issue when processing an Analysis Services cube in production. The error is reported as:
Analysis Server Error: Memory error [The server computer is running low on memory. Close some applications, and then try again.]

The cube uses a snowflake schema with 15 dimensions and 14 measures, and is split into 4 partitions of approximately 18 million rows each. The cube processing task occasionally completes successfully after a restart of Analysis Services.
The machine has 4 GB of RAM and is dedicated to Analysis Services, although standard Windows 2000 Server restricts each process to 2 GB of address space. Accordingly, our Process Buffer Size setting has been set to 1664 MB.
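To see why 1664 MB is tight, here is a back-of-the-envelope sketch of the 2 GB per-process budget described above. The dimension-memory and overhead figures are illustrative assumptions, not measurements from this server:

```python
# Rough budget for a 2 GB process address space on stock Windows 2000.
# Only address_space and process_buffer come from the post above;
# the other two figures are assumed for illustration.
MB = 1024 * 1024

address_space = 2048 * MB      # per-process limit without /3GB
process_buffer = 1664 * MB     # Process Buffer Size quoted above
dimension_memory = 256 * MB    # assumed: dimension members held in RAM
server_overhead = 64 * MB      # assumed: code, metadata, connections

headroom = address_space - (process_buffer + dimension_memory + server_overhead)
print(f"Headroom: {headroom // MB} MB")  # prints "Headroom: 64 MB"
```

With any growth in dimension members or concurrent query load, that headroom disappears and processing fails with the "Memory error" above.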

Recommended solutions to date are:

- upgrade to Windows 2000 Advanced Server or Windows Server 2003 to permit use of 3 GB of memory per process
- modify the cube design
- trial combinations of Analysis Services configuration settings
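For the first option above: on Windows 2000 Advanced Server the 3 GB user address space is enabled with the /3GB switch on the OS entry in boot.ini. A sketch of what the entry might look like follows; the ARC path is illustrative, so copy your server's existing entry rather than this one:

```
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINNT="Windows 2000 Advanced Server" /3GB
```

A reboot is required for the switch to take effect.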

Has anybody experienced similar memory-related processing issues and achieved a successful outcome?

TIA

John
 
I experienced memory issues on a previous project, but that server was already running with 3 GB. If you have large, complex cubes I do recommend using the 3 GB option. Remember that if you upgrade to Windows 2000 Advanced Server, you have to set the 3 GB memory limit for Analysis Services in the registry, because Analysis Manager only allows a setting of up to 2 GB. Are other applications being run from this server?
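For reference, the Analysis Services memory settings live under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\OLAP Server\CurrentVersion. A .reg sketch is below; the value name (HighMemoryLimit) and its type are from memory and should be verified against your own server's registry before importing anything:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\OLAP Server\CurrentVersion]
; assumed value name; memory ceiling in bytes (0xC0000000 = 3 GB)
"HighMemoryLimit"=dword:c0000000
```

Restart the Analysis Services service (MSSQLServerOLAPService) after changing it.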

One thing to check: reboot the server and open Task Manager once the server is back up. Look for the msmdsrv.exe process and see how much memory it is using at startup.

Remember that Analysis Services reads all cube metadata and dimension members into memory at startup. Also, if you have dimension- or cell-level security for numerous roles, these too eat up memory: the elements each security role allows are read into memory and remain there until a service restart or reboot. I am not sure whether SP3 or SP3a included the fix, but there was an issue where, in large cubes with large metadata definitions, you could run out of memory simply by drilling down in the Analysis Manager tree view.

"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
I almost forgot: does the cube being processed when this occurs contain a distinct count measure, and how many aggregations does it have?

"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
