
Circumventing The Transaction Log Bottleneck


jkb3
Programmer · Jan 5, 2001 · US
We are performing a migration from Btrieve to SQL Server 7. Last night, while inserting the results of a two-table join that produced a table of 14.6 million records, we found that the only way to get the operation to finish was to increase the size of tempdb, allow automatic growth of the log file, and turn "Truncate Log on Checkpoint" on.

Is there a way to perform a DML statement, such as an INSERT or an UPDATE, but limit it to, say, 1 million records at a time? In other words, I want to perform the same query or procedure, but in smaller chunks. Because the incoming data is so twitchy, I do not want to break the query itself into parts; I think in this case the whole would not equal the sum of the parts.

Thanks

jkb3
 
I'm not sure of your process, but SQL Server lets you limit the number of records a statement will process by setting ROWCOUNT:

SET ROWCOUNT 1000000

Use SET ROWCOUNT 0 to return to processing an unlimited number of records.
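A sketch of how ROWCOUNT is typically used for chunked DML — the statement is simply re-issued in a loop until it affects no more rows. The table and column names (dest, src1, src2, id) are hypothetical stand-ins for your join, and the NOT EXISTS test is one assumed way to skip rows already copied:

```sql
-- Limit every subsequent DML statement to 1,000,000 rows
SET ROWCOUNT 1000000

WHILE 1 = 1
BEGIN
    -- Hypothetical insert: copy join results not yet in the target
    INSERT INTO dest (id, col1, col2)
    SELECT s1.id, s1.col1, s2.col2
    FROM src1 s1
    JOIN src2 s2 ON s2.id = s1.id
    WHERE NOT EXISTS (SELECT 1 FROM dest d WHERE d.id = s1.id)

    -- @@ROWCOUNT is 0 once the source is exhausted
    IF @@ROWCOUNT = 0 BREAK

    -- With Truncate on Checkpoint set, this lets log space be reclaimed between batches
    CHECKPOINT
END

-- Restore unlimited row processing
SET ROWCOUNT 0
```

Each pass through the loop is its own implicit transaction, so the log only has to hold one million rows' worth of changes at a time.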

Is this what you were looking for, or are there other considerations? Terry
 
As a quick test, try setting the Recovery Interval to something like 200 and see if that makes a difference. Robert Bradley
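For reference, the recovery interval Robert mentions is a server-wide configuration option set through sp_configure; a minimal sketch, assuming you have permission to change advanced options:

```sql
-- 'recovery interval' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- Widen the target interval between automatic checkpoints to ~200 minutes
EXEC sp_configure 'recovery interval', 200
RECONFIGURE WITH OVERRIDE
```

A larger interval means fewer automatic checkpoints during the bulk insert, at the cost of longer crash recovery time; remember to set it back afterward.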

 
That is probably the answer. Just a follow-up question: if I use the SET ROWCOUNT 1000000 statement, will my process stop after the first million records, or will it continue in increments of a million?

I understand that this might have to be done programmatically.

Thanks for steering me in that direction; I will research it some more.

jkb3
 
Thanks folks.

We did get this to finish; I was just trying to optimize it so that it would not take an hour to complete. I will tweak the recovery interval, as suggested by foxdev, but I believe SET ROWCOUNT on its own is not a valid solution, because processing simply stops upon reaching the rowcount limit.
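If re-issuing the statement under SET ROWCOUNT is unattractive, another common way to get "the same query in smaller chunks" is to walk the source in key ranges. A sketch with the same hypothetical names as above, assuming src1.id is a dense-enough integer key:

```sql
DECLARE @lo int, @hi int, @max int

-- Find the upper bound of the key range to walk
SELECT @max = MAX(id) FROM src1
SELECT @lo = 1

WHILE @lo <= @max
BEGIN
    -- Each pass covers a one-million-wide slice of the key space
    SELECT @hi = @lo + 999999

    INSERT INTO dest (id, col1, col2)
    SELECT s1.id, s1.col1, s2.col2
    FROM src1 s1
    JOIN src2 s2 ON s2.id = s1.id
    WHERE s1.id BETWEEN @lo AND @hi

    -- With Truncate on Checkpoint set, free log space between slices
    CHECKPOINT

    SELECT @lo = @hi + 1
END
```

Because every slice is the identical join restricted to a key range, the union of the slices equals the full result, which avoids the "whole not equaling the sum of the parts" concern as long as the source is stable while the loop runs.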

Thanks again.

jkb3
 