(I posted this in a forum a few weeks ago, but received a request for it by email.. so here comes the FAQ)
I work in an environment where we don't always have access to the configuration of our DB. That being said, if we find ourselves in a situation where we are running queries on tens or hundreds of millions of records (this happens daily), we might need to add space to the log or change the size of tempdb, as the default installations we are given may not be sufficient.
Unfortunately we are NOT always allowed to do this.
We've overcome this issue with a little tip I was shown, that is, 'IF' you don't mind running queries in batches.
This is just an example of how to do this and NOT an 'actual' script.
DECLARE @maxRowsToProcess int, @exitLoopFlag int
-- Initialise both variables before the loop (the original post left them unset)
SELECT @maxRowsToProcess = 10000, @exitLoopFlag = 0
SET ROWCOUNT @maxRowsToProcess
WHILE ( @exitLoopFlag = 0 )
BEGIN
    BEGIN TRAN
    DELETE FROM someTable WHERE blah = blah
    -- OR, for updates (run one DML statement per loop so @@rowcount
    -- below reflects it):
    -- UPDATE someTable SET someField = 'someValue' WHERE someField != 'someValue'
    IF ( @@rowcount < @maxRowsToProcess )
    BEGIN
        -- fewer rows than the cap means this was the last batch
        SELECT @exitLoopFlag = 1
    END
    COMMIT TRAN
    CHECKPOINT
END
SET ROWCOUNT 0
What this does is process only @maxRowsToProcess rows at a time, thus limiting the amount of information stored in tempdb/cache and in the tx log. Once you commit the transaction and checkpoint it, the information is 'moved' to permanent storage outside of tempdb and/or your current transaction log.
Using this methodology you can now run DML (Data Manipulation Language) commands that would usually cause your database to go into LOG SUSPEND mode. That would usually be immediately followed by the lovely message "all processes will be suspended until you DUMP TRANSACTION or increase the transaction log".
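To make the pattern concrete, here is a hedged sketch of a batched purge. The table auditLog, the column logDate, and the batch size of 10,000 are all illustrative inventions of mine, not from any real schema:

```sql
-- Purge audit rows older than 90 days in batches, so the transaction
-- log never holds more than one batch of work at a time.
DECLARE @maxRowsToProcess int, @exitLoopFlag int
SELECT @maxRowsToProcess = 10000, @exitLoopFlag = 0

SET ROWCOUNT @maxRowsToProcess   -- cap every DML statement at 10,000 rows

WHILE ( @exitLoopFlag = 0 )
BEGIN
    BEGIN TRAN
    DELETE FROM auditLog
    WHERE logDate < dateadd(day, -90, getdate())

    -- Fewer rows than the cap means we just did the last batch.
    IF ( @@rowcount < @maxRowsToProcess )
    BEGIN
        SELECT @exitLoopFlag = 1
    END
    COMMIT TRAN
    CHECKPOINT   -- lets the log records for the committed batch be truncated
END

SET ROWCOUNT 0   -- always reset, or later queries in the session stay capped
```

Note that on Sybase ASE this SET ROWCOUNT idiom is the usual approach; on more recent Microsoft SQL Server versions, SET ROWCOUNT is deprecated for DML and DELETE TOP (n) is the recommended way to cap batch size.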