Well, the general answer would be: lots of memory, lots of CPU power (64-bit), and lots of drives.
Like everything else, though, it depends on a number of other factors:
1) How many other apps will be running on the server?
2) Will it co-exist with the underlying RDBMS?
3) Number of OLAP databases
4) Number of cubes
5) Number of concurrent users
6) Size of the cubes
7) Size of the underlying data
8) Data load method
9) Data refresh frequency
These are just the things that came to mind while replying; if I sat down and thought about it I could probably double or triple the list. A few of them you can check with a quick query, as sketched below.
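If you already have an SSAS instance to size against, a rough way to get a feel for the number of databases and cubes is to query the Analysis Services DMVs from an MDX query window in SSMS. This is just a sketch of the idea using the standard schema rowsets:

-- List the OLAP databases on the instance
SELECT [CATALOG_NAME] FROM $SYSTEM.DBSCHEMA_CATALOGS

-- List the cubes in the current database, with when the data was last refreshed
SELECT [CUBE_NAME], [LAST_DATA_UPDATE] FROM $SYSTEM.MDSCHEMA_CUBES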
The main thing to understand is that SSAS will use everything you give it. A small cube on a server with lots of memory could, in theory, end up entirely in memory after enough queries, and that is pretty much the goal with SSAS: enough memory for it to build the cube aggregations comfortably and to cache as many query results as possible. On the storage side, you want enough drives - spindles for IO throughput, not just raw capacity - so that values not in cache can be read from disk quickly.
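If you want to see where that memory actually goes once the server has warmed up, the object memory DMV is handy (again from an MDX query window). Broadly, shrinkable memory is what the engine can evict - largely caches - while non-shrinkable memory is what it has to keep; this is only a sketch, and on older versions you may need to fall back to the perfmon counters instead:

-- Which objects SSAS is currently holding in memory, biggest cached objects first
SELECT [OBJECT_PARENT_PATH], [OBJECT_ID],
       [OBJECT_MEMORY_SHRINKABLE], [OBJECT_MEMORY_NONSHRINKABLE]
FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE
ORDER BY [OBJECT_MEMORY_SHRINKABLE] DESC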
As with configuring a MS SQL Server system, you need to know the specifics - how big the underlying data is, how it is loaded, how many people will hit it - before you can size and optimise the box properly.
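The size of the underlying data, at least, is easy to measure on the relational side. Something along these lines run against the source data warehouse (assuming SQL Server 2005 or later) shows which tables will dominate processing time and IO:

-- Data size per table in the source warehouse (excludes nonclustered indexes)
SELECT t.name AS table_name,
       SUM(ps.row_count) AS total_rows,
       SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables AS t ON t.object_id = ps.object_id
WHERE ps.index_id IN (0, 1)  -- heap or clustered index only, so rows are not double counted
GROUP BY t.name
ORDER BY reserved_mb DESC;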
64-bit hardware, multiple CPUs and Enterprise editions of Windows Server and SQL Server are the way to go for anything bigger than a small-to-medium implementation, particularly anything that involves data warehousing or is likely to have a large number of users.
OLAP performance is also heavily dependent on good dimensional model and cube design - even a powerful server will struggle with a badly designed cube - so it is worth following the published cube design best practices as well.