Hello all. I am a relative SQL novice. I have a 3rd party application that logs alarm information to a SQL database consisting of one ugly, flat, non-normalized table. Unfortunately I have no control over this table or its design; the 3rd party app creates the table and logs data to it.
This table has grown to over 2 million rows, and query performance against it has gotten very bad (60+ seconds or more).
I'm looking for suggestions on how to remedy this situation. Since the users are primarily interested in recent data (the last 12 months), my initial thought was to create a second database or table and move the old data into it. If that seems like a good idea, I could use some advice on what to use to do it: DTS, stored procedures, etc.
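To make the idea concrete, here is a minimal sketch of the "move rows older than 12 months to an archive table" approach. The table and column names (AlarmLog, AlarmLogArchive, EventTime) are placeholders, since I don't know the real 3rd-party schema:

```sql
-- Hypothetical names: source table AlarmLog, archive table AlarmLogArchive,
-- datetime column EventTime. Substitute the real names from the 3rd-party schema.

-- One-time setup: clone the table layout without copying any rows.
-- SELECT * INTO AlarmLogArchive FROM AlarmLog WHERE 1 = 0

DECLARE @cutoff datetime
SET @cutoff = DATEADD(month, -12, GETDATE())   -- keep the last 12 months live

BEGIN TRANSACTION

-- Copy the old rows to the archive table...
INSERT INTO AlarmLogArchive
SELECT *
FROM AlarmLog
WHERE EventTime < @cutoff

-- ...then remove them from the live table, using the same fixed cutoff
-- so both statements operate on exactly the same set of rows.
DELETE FROM AlarmLog
WHERE EventTime < @cutoff

COMMIT TRANSACTION
```

If something like this is workable, I assume it could be wrapped in a stored procedure and run on a schedule via a SQL Server Agent job; for a large first-time move, breaking the work into batches (e.g. with SET ROWCOUNT on SQL 2000) might be needed to keep the transaction log manageable. Corrections welcome.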
This is SQL Server 2000. If there's any other information I should post, or if this belongs in another forum, please let me know. Thanks.