
Insert Thousands of Rows into SQL Server


noonan59

Programmer
Oct 27, 2003
Hi - just wondering what anyone thinks would be the most efficient way to insert potentially 15-20 thousand records into SQL Server 2000 from an ASP.NET application?

My scenario: I pull data from a text file that could contain up to 20,000 rows into a dataset, then validate each dataset row to make sure every column contains valid data. Each valid row then needs to be saved into a SQL table that holds the clean data; if a row fails validation, I throw the record into an error table instead.

Obviously this can be quite time consuming, so I'm trying to decide on the most efficient way to accomplish the task.
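To give an idea of the shape of it, the loop currently looks roughly like this (C# shown, heavily simplified - the column names and the IsValid/SaveValidRow/SaveErrorRow helpers are just stand-ins, not my real code):

Code:
using System.Data;
using System.IO;

// Read the uploaded text file into a DataTable (tab-delimited, for example).
// filePath is wherever the uploaded file was saved.
DataTable table = new DataTable("Import");
table.Columns.Add("CustomerId", typeof(string));
table.Columns.Add("Amount", typeof(string));

using (StreamReader reader = new StreamReader(filePath))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        table.Rows.Add(line.Split('\t'));
    }
}

// Validate each row: good rows go to the main table, bad rows to the error table.
foreach (DataRow row in table.Rows)
{
    if (IsValid(row))          // column-by-column checks
        SaveValidRow(row);     // currently one INSERT per row
    else
        SaveErrorRow(row);     // log it to the error table instead
}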

Thanks!
 
I would create a separate application (e.g. a web service or a Windows application installed on the server); that could then just be called/started from your ASP.NET application.
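Just as a sketch of what I mean (the exe name and argument here are made up), the page could hand the uploaded file off to a separate importer process on the server and return right away:

Code:
using System.Diagnostics;

// Hand the heavy import off to a separate process on the server
// so the ASP.NET request isn't tied up for the whole load.
ProcessStartInfo info = new ProcessStartInfo();
info.FileName  = @"C:\Jobs\ImportValidRows.exe";   // hypothetical importer app
info.Arguments = "\"" + uploadedFilePath + "\"";   // path of the uploaded text file
info.UseShellExecute = false;
Process.Start(info);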

 
Depending on what resources you're concerned with, you could dump valid rows back to a file and then use one of the bulk processing methods of SQL Server (bcp/BULK INSERT/DTS) to insert the data. This would probably be the best way to go if your greatest concern is database resources, especially if other processes might be using that table and locking would be an issue.
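Roughly like this (just a sketch - the table name, file path, delimiters, and connection string are assumptions, and the file path has to be visible from the SQL Server machine):

Code:
using System.Data.SqlClient;

// Load the file of already-validated rows in one shot with BULK INSERT.
// dbo.ValidData and the tab-delimited layout are placeholders for your own.
string sql =
    "BULK INSERT dbo.ValidData " +
    "FROM 'C:\\Imports\\valid_rows.txt' " +
    "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', TABLOCK)";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    cmd.CommandTimeout = 300;   // give a 20,000-row load plenty of room
    conn.Open();
    cmd.ExecuteNonQuery();
}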

________________________________________
Andrew

I work for a gift card company!
 
I didn't mention that I also need to check whether a validated record is already in the table, in which case I need to update it instead of inserting it.

I don't know if it will happen often, but it's possible I could have 20 or so people using this site at the same time, in which case they would each be validating their own text files and inserting into the valid table simultaneously.
 
Well, if you can do your checking in a stored procedure instead of in .NET, your best bet is surely to bulk copy to a temp table and then do all your standard processing from the SP. When processing data, it's always best to keep all the data and all the processing local to the database server.
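Something like this, as a rough sketch (the StagingData/ValidData tables and their columns are made up, and in practice the SQL below would live in a stored procedure rather than in the page). It also takes care of your update-versus-insert check in one set-based pass:

Code:
using System.Data.SqlClient;

// 1) BULK INSERT the validated file into a staging table,
// 2) update the rows that already exist, insert the ones that don't.
string sql =
    "BULK INSERT dbo.StagingData FROM 'C:\\Imports\\valid_rows.txt' " +
    "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n'); " +

    "UPDATE v SET v.Amount = s.Amount " +
    "FROM dbo.ValidData v JOIN dbo.StagingData s ON v.CustomerId = s.CustomerId; " +

    "INSERT INTO dbo.ValidData (CustomerId, Amount) " +
    "SELECT s.CustomerId, s.Amount FROM dbo.StagingData s " +
    "WHERE NOT EXISTS (SELECT 1 FROM dbo.ValidData v " +
    "                  WHERE v.CustomerId = s.CustomerId); " +

    "TRUNCATE TABLE dbo.StagingData;";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}

With 20 people loading at the same time you'd want per-user staging tables (or a session key column on the staging table) rather than one shared table, otherwise their batches will step on each other.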

But suppose that's not an option (or at least not a desired one). From .NET, you'd probably want to set up your database commands with parameters: just reset the parameter values and execute the same command objects again on each iteration. That saves the database from having to parse your query at all, since it will already have an execution plan cached for it.
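Something along these lines (C# shown; the table, columns, connection string, and the validRows collection are placeholders) - set the parameters up once, then just swap in each row's values inside the loop:

Code:
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO dbo.ValidData (CustomerId, Amount) " +
    "VALUES (@CustomerId, @Amount)", conn))
{
    // Create the parameters once; only their values change per row.
    cmd.Parameters.Add("@CustomerId", SqlDbType.VarChar, 20);
    cmd.Parameters.Add("@Amount", SqlDbType.Money);

    conn.Open();
    foreach (DataRow row in validRows)
    {
        cmd.Parameters["@CustomerId"].Value = row["CustomerId"];
        cmd.Parameters["@Amount"].Value     = row["Amount"];
        cmd.ExecuteNonQuery();   // same plan reused, no re-parsing per row
    }
}

Wrapping the whole loop in a single SqlTransaction will usually speed it up further as well.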

________________________________________
Andrew

I work for a gift card company!
 