Guest_imported
New member
I'm running MS-SQL Server for a large data analysis project. The largest database is 90 gigs, with 20 gigs of growth space remaining. The tables contain static data, and we run large queries against them.
All of this is housed in an 8-processor Compaq with a terabyte of storage, running under Windows 2000 Advanced Server.
Here is my problem - sometimes I can run a simple query against 5 million records in about 2 minutes. Today I am running a simple query against 500,000 records and it has been going for 2 hours - by my calculations that's roughly 600 times slower per record than normal (a tenth of the data should take about 12 seconds at the usual rate, not two hours). What's the deal with SQL Server? I want it to sit down and shut up and not try to actively manage my data - just wait until I submit a query and run it as quickly as possible.
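In case it helps with the diagnosis, here is roughly what I can run from another connection while the query is stuck, to see whether it is actually executing or just blocked or waiting. This is only a sketch in SQL Server 2000 syntax, and the SPID 51 is a placeholder for whatever sp_who2 reports for my stuck connection:

-- List all connections; find the stuck query's SPID, check whether its
-- Status is RUNNABLE/SUSPENDED, and whether BlkBy points at another SPID.
EXEC sp_who2

-- Show the locks held or requested by that connection
-- (51 is a placeholder SPID taken from the sp_who2 output).
EXEC sp_lock 51

-- Show the last statement submitted on that connection, to confirm
-- it really is the query I think it is.
DBCC INPUTBUFFER (51)

-- Cumulative wait statistics for the whole server, to see whether time
-- is going to I/O, lock, or parallelism waits.
DBCC SQLPERF (WAITSTATS)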
I have turned "auto start" off, and I have turned "truncate logs at checkpoint" on. Is there anything else I should do to improve performance for what is essentially a data warehousing application? What does SQL Server do behind the scenes that could be bringing performance to its knees?
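Is this roughly the kind of housekeeping I should be doing, given that the data is static? A sketch of what I have in mind, with Warehouse and dbo.FactTable standing in for my actual database and largest table:

-- Keep logging minimal for a read-mostly warehouse (SQL Server 2000 syntax).
ALTER DATABASE Warehouse SET RECOVERY SIMPLE

-- If the data truly never changes, marking the database read-only
-- also skips some locking and logging work.
EXEC sp_dboption 'Warehouse', 'read only', 'true'

USE Warehouse

-- Refresh optimizer statistics so query plans reflect the actual data.
UPDATE STATISTICS dbo.FactTable WITH FULLSCAN

-- Check index fragmentation; heavy fragmentation turns scans into
-- mostly random I/O.
DBCC SHOWCONTIG ('dbo.FactTable')

-- Rebuild the indexes if scan density looks poor.
DBCC DBREINDEX ('dbo.FactTable')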