I have to run an UPDATE on 2.5 million records, and it takes forever even though I have an index built. The server has 4 GB of memory, and SQL Server is configured to use 1-2 GB of it.
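For context, the shape of the update I'm running is roughly the sketch below (table and column names are made up; the real ones don't matter). One thing I've tried is breaking it into batches with `SET ROWCOUNT`, which keeps each transaction and its locks small on SQL Server 2000:

```sql
-- Hypothetical names: dbo.Orders, Status are placeholders for my real schema.
-- Batching keeps each transaction (and its lock/log footprint) small.
SET ROWCOUNT 10000              -- limit each UPDATE to 10,000 rows (SQL 2000 style)
DECLARE @rows INT
SET @rows = 1
WHILE @rows > 0
BEGIN
    UPDATE dbo.Orders
    SET Status = 'Processed'
    WHERE Status = 'Pending'    -- the WHERE column is the one I have indexed
    SET @rows = @@ROWCOUNT      -- loop ends when no rows remain to update
END
SET ROWCOUNT 0                  -- reset so later statements are unaffected
```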
When I run an update like this, bizarrely, it actually slows down the other databases on the server.
For example, I have another, completely unrelated database on this server with 50 tables. One of them is a static table with 200,000 records and an index on it. When I run the big update in the separate database, it's almost as if the index on this static table gets removed (the two databases are not related at all). The index is still visible in Enterprise Manager, but queries against the table are slow. If I recreate the index, it runs fast again. This happens every time I do a mass update in the separate database.
Can indexes just go away on a static table? Do I have to recreate indexes like this once a day?