Tek-Tips is the largest IT community on the Internet today!


HELP!!! 36Gb log file!!!


mm8294 (MIS), Oct 12, 2001, 73 posts, US
HELP!!!

We need to transfer all 52 million records from table apple into table banana. We have 36GB of free space available, and I have already put the database into the simple recovery model, but the transaction still ate up all the free space and failed. I have been trying for several days using either the Import Wizard or an INSERT SQL script in Query Analyzer. This is a very important project and all the users are waiting for me! And the boss is already upset. A friend told me I can do it in Query Analyzer: add a line before the INSERT script and the transaction will not be logged, but he does not know the details.

Could somebody help me with this? It's killing me.
 
Any command like this will be logged. All transactions in SQL Server are logged. Setting the database to the simple recovery model simply tells SQL Server to purge the transaction log records after each transaction is committed.
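For reference, switching the recovery model looks like this from a command prompt (a hedged sketch: "MyDB" is a placeholder database name, and `osql -E` assumes a trusted connection to the local server):

```shell
# Switch the database to the SIMPLE recovery model (SQL Server 2000 syntax).
osql -E -Q "ALTER DATABASE MyDB SET RECOVERY SIMPLE"

# Verify the current recovery model.
osql -E -Q "SELECT DATABASEPROPERTYEX('MyDB', 'Recovery')"
```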

You've got a couple of options here.

One is to export the data to a flat file using BCP, then import it using BCP. When doing large imports via BCP I always recommend using the -b flag. It tells BCP to commit the data in batches of X rows (I usually use 1000 rows at a time). Without the flag, BCP will attempt to load all 52 million records in a single transaction. With the flag it loads 1000 records, then commits; loads another 1000, then commits; and so on.
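The BCP round-trip described above might look like this (a sketch, assuming placeholder server, database, and file names; `-c` is character format, `-T` a trusted connection, `-S` the server):

```shell
# Export table apple to a flat file (no batching needed on export).
bcp "MyDB.dbo.apple" out apple.dat -c -T -S MYSERVER

# Import into banana in 1000-row batches, so each batch commits
# separately and the log can stay small under SIMPLE recovery.
bcp "MyDB.dbo.banana" in apple.dat -c -T -S MYSERVER -b 1000
```

If the import fails partway, truncate banana before rerunning the `bcp in` step, as noted below.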

The second is to use DTS to move the data. The easiest way to do this is to run through the Import/Export Wizard. Towards the end you'll be prompted with a page that asks what you want to do with the package: one option is to run it now, another is to save it to SQL Server. Uncheck the run-now box, check the save box, and save the package to the SQL Server. Now edit the package. Double-click on the black transformation line. On the Options tab, towards the bottom, there will be four boxes you can type into. Two will be labeled something to the effect of "Rows per Batch" and "Commit Size". Set them both to 1000, click OK, and run the package.

For either of these techniques to be effective, the database should be in the SIMPLE recovery model. If the import fails or is stopped, you will need to truncate the destination table before starting it again.

Denny
MCSA (2003) / MCDBA (SQL 2000) / MCTS (SQL 2005) / MCITP Database Administrator (SQL 2005)

--Anything is possible. All it takes is a little research. (Me)
 
Mrdenny,

Thank you very much for your help. Both of your methods make sense to me. I am trying your second method now; the transfer is in progress and I believe it will take at least 6 hours. I will let you know tomorrow if it works.

 
Another thought is to move the transaction log file and/or tempdb to another logical drive with more free space. Sometimes tempdb will also grow along with the transaction log, which is why I mention it.
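Moving tempdb might be sketched like this (assuming the default logical file names tempdev/templog and a placeholder E:\ path; the change only takes effect after the SQL Server service is restarted):

```shell
# Point tempdb's data and log files at a roomier drive (SQL Server 2000).
osql -E -Q "ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'E:\tempdb.mdf')"
osql -E -Q "ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'E:\templog.ldf')"

# Restart the SQL Server service so tempdb is recreated in the new location.
net stop MSSQLSERVER
net start MSSQLSERVER
```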

To move the transaction log, make sure nothing is running against the database (and forewarn the end users so they don't freak out when the db disappears), detach the database, and cut-and-paste or copy-and-paste the transaction log file to your new drive. Reattach the db pointing to the new location of the transaction log, and then run your transactions.
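The detach/move/reattach sequence above could look like this (a sketch: database name and file paths are placeholders; adjust them to your layout):

```shell
# Detach the database so its files can be moved (SQL Server 2000).
osql -E -Q "EXEC sp_detach_db 'MyDB'"

# Move the log file to the drive with more free space.
move D:\MSSQL\Data\MyDB_Log.ldf E:\MyDB_Log.ldf

# Reattach, pointing at the data file's original path and the log's new path.
osql -E -Q "EXEC sp_attach_db 'MyDB', 'D:\MSSQL\Data\MyDB.mdf', 'E:\MyDB_Log.ldf'"
```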

FYI: This is something you should do only when you're seriously crunched for space and have tried all other options. Other options tend to be better solutions than the "hard drive shuffle". @=)



Catadmin - MCDBA, MCSA
"No, no. Yes. No, I tried that. Yes, both ways. No, I don't know. No again. Are there any more questions?"
-- Xena, "Been There, Done That"
 
