
Passing data to SQL Server

Status
Not open for further replies.

azrobert

Programmer
Apr 27, 2002
392
US
I need to pass a lot of data to SQL Server. Currently I am using a stored procedure in a loop, firing it for each record in the recordset to do the insert. There are 6-800 records that need to be sent and inserted.

Is there a better way to do this? In .NET I pass an XML variable and then the stored procedure does a bulk insert. I have not seen a way to do this in VB6.

Thanks
 
There are several ways you can do this. First... can you tell me where the data originates? If this data starts off in a file, you should really look into bcp and/or BULK INSERT.
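As an illustration, a BULK INSERT of a CSV file looks something like this (the table name, file path, and delimiters here are placeholders, not from the original thread):

Code:
BULK INSERT dbo.MyTable
FROM 'C:\data\records.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter
    ROWTERMINATOR = '\n',    -- row delimiter
    FIRSTROW = 2             -- skip the header row
)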

-George

"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
The data is from a recordset that I have pulled from a local Sybase database. I am then trying to send the data to a web-hosted SQL Server database.

I am doing inserts to 4 tables from 4 different recordsets.

Currently I have it working by looping through the recordset and using a stored procedure to insert each record. I know there has to be a better way. In .NET I just create an XML variable from the entire recordset, pass that to the stored proc, and do a bulk insert. I am looking for something like that in VB6.

Thanks,
Robert
 
There are various ways to do this. The fastest is likely to be bcp, a utility supplied by Microsoft that is installed along with SQL Server. This utility is the fastest way to import data into SQL Server. A close second is the BULK INSERT command. Both methods involve a file of data. This file can be Excel, CSV, TXT, etc...
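For reference, a bcp import from the command line might look like this (server, database, table, file name, and credentials are made up for the example):

Code:
bcp MyDatabase.dbo.MyTable in C:\data\records.txt -S myserver -U myuser -P mypassword -c -t,

-S is the server, -U/-P the login, -c means character (text) format, and -t, sets a comma as the field terminator.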

Third in the list of performance is SSIS (SQL Server Integration Services). SSIS has a ton more functionality than bcp or bulk insert.

I encourage you to read this: 6 ways to import data into SQL Server

Recently, I started a project where I connected to a web service to get some XML data. The XML data contained 1,000 "rows" of data. I consider this a tiny amount of data. When saved to disk, this XML file was approximately 500 K. It took about 3 seconds to get the data from the web service, and then another 30 seconds to load the data into my SQL Server database. Basically, I created a stored procedure with a text parameter, and then passed the XML into the procedure. The procedure used OPENXML to parse the XML file into a table variable. Then I would massage the data a little before dumping it to the real tables. The OPENXML part took 26 of the 30 seconds. I thought this was completely unacceptable.
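That procedure followed roughly this shape (a simplified sketch; the procedure name and columns are placeholders, not the original code):

Code:
CREATE PROCEDURE dbo.LoadFromXml
    @xml NTEXT
AS
BEGIN
    DECLARE @hDoc INT

    -- Parse the XML document into an internal representation
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml

    DECLARE @staging TABLE (RecId INT, Source VARCHAR(50))

    -- OPENXML shreds the document into rows (this was the slow part)
    INSERT INTO @staging (RecId, Source)
    SELECT RecId, Source
    FROM OPENXML(@hDoc, '//NewDataSet/Table', 2)
         WITH (RecId INT, Source VARCHAR(50))

    EXEC sp_xml_removedocument @hDoc

    -- ...massage @staging, then insert into the real tables...
END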

Why did I tell this story? Because I want you to realize that large XML documents can be very slow to deal with.

I solved the problem by parsing the XML in VB (using the Microsoft XML 6.0 COM object). I created an array of strings. Each element of the array contained a single stored procedure call (Ex "Exec MyProcName 'blah',2,'whatever'").

I then used an ADO connection object to execute the procedures. But here's the key point. I did NOT loop through the array to do it. Instead, I executed a single "command" that contained all the commands. This made for one large trip to the server (with a lot of data), but once there, the database was able to effectively deal with the data.

Just to be clear, I know where this data is coming from, and I know what the data types are. I am not concerned about SQL injection. My code looks like this:

Code:
    ' oXML is an MSXML2.DOMDocument60 created earlier; oNodes, Node,
    ' arTemp, i, RecId and MaxRecId are declared above this fragment
    If XML <> "" Then
        Call oXML.loadXML(XML)
        
        Set oNodes = oXML.selectNodes("//NewDataSet/Table")
        Count = oNodes.Length
        If oNodes.Length > 0 Then
            Debug.Print oNodes.Length
            ReDim arTemp(oNodes.Length - 1)
            i = 0
            For Each Node In oNodes
                RecId = CLng(Node.selectSingleNode("RecId").Text)
                If RecId > MaxRecId Then
                    MaxRecId = RecId
                End If
                arTemp(i) = "Exec [dbo].[TrackInsertPositionWadeGarcia] "
                arTemp(i) = arTemp(i) & "'" & Node.selectSingleNode("Source").Text & "'"
                DateTime = Split(Node.selectSingleNode("RecTime").Text, "+")(0)
                arTemp(i) = arTemp(i) & ",'" & DateTime & "'"
                arTemp(i) = arTemp(i) & "," & Node.selectSingleNode("Longitude").Text / 1000000
                arTemp(i) = arTemp(i) & "," & Node.selectSingleNode("Latitude").Text / 1000000
                arTemp(i) = arTemp(i) & "," & Node.selectSingleNode("Speed").Text / 1000
                arTemp(i) = arTemp(i) & ",'" & Node.selectSingleNode("Heading").Text / 100 & "'"
                
                i = i + 1
            Next
            
            ' Only execute when there is at least one command to send
            Call ExecuteSQL(Join(arTemp, vbCrLf))
        End If
        Set oXML = Nothing
    End If

This is the important line.

[!]Call ExecuteSQL(Join(arTemp, vbCrLf))[/!]

I join the array into one big string and execute it all at once.

My import time went from 30 seconds down to 2.

I hope this helps.

-George

"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
Thanks for the time.

I will take a look at this later today...

Robert
 
George,

Would the method you described work for loading data from ListViews to the server, if I put my ListView data into an array as you demonstrated and then join it?

Will you share a little more about what Call ExecuteSQL(Join(arTemp, vbCrLf)) is/does?
 
Nothing fancy.... just the nuts and bolts of running a query that does not return a result set.

Code:
Public Sub ExecuteSQL(ByVal sql As String)
    
    Dim DB As ADODB.Connection
    
    Set DB = CreateObject("ADODB.Connection")
    DB.ConnectionString = GlobalConnectionString
    DB.CursorLocation = adUseClient
    DB.CommandTimeout = 0
    
    Call DB.Open
    Call DB.Execute(sql)
    
    DB.Close
    Set DB = Nothing
    
End Sub

If you are looping in VB, and inside the loop you are calling a stored procedure that does NOT return a result set, this approach will improve your performance. The "further away" from the database you are, the more improvement you will see.

For example, if the database is local, you may not see much improvement at all. If the database is on the same LAN as your client app, you will see some moderate improvement. If your database is on another computer that you access through the internet, you will likely see a big improvement.

Basically, this removes the back and forth communication to the database. Think about the looping situation.... You make a connection, pass some data through the network to the database. The database does stuff, and then returns an acknowledgment to the client. When you loop 1000 times, each step is repeated 1000 times. If you batch your commands like I show here, there is only one trip to the server and one trip back.

The only reason I use an array is for performance. Messing with large strings is slow in any language. By using an array, each element of it is very small (a hundred bytes or so). Small strings are fast, big strings are slow. By placing the commands in an array, I get the benefit of dealing with small strings. At the last moment, I join the array to make a big string. This is remarkably efficient.
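For the ListView question above, the same pattern might look something like this (a sketch; the procedure name, column count, and control name are assumptions):

Code:
Dim i As Long
Dim arTemp() As String

ReDim arTemp(ListView1.ListItems.Count - 1)

' Build one Exec statement per ListView row
For i = 1 To ListView1.ListItems.Count
    With ListView1.ListItems(i)
        arTemp(i - 1) = "Exec dbo.MyInsertProc " & _
            "'" & .Text & "'," & _
            "'" & .SubItems(1) & "'," & _
            "'" & .SubItems(2) & "'"
    End With
Next i

' One round trip to the server instead of one per row
Call ExecuteSQL(Join(arTemp, vbCrLf))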

Make sense?

-George

"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
Thanks for explaining it, George. It does make sense. The one thing I've always searched for was a better way to put several hundred rows in a DB without loops that call the same SP several hundred times.

My DBs are on a LAN, so hopefully I'll see a little improvement. I have 10 listviews, and I'm going to store all the data from each listview in one table, so putting the data into an array won't be that big of a deal.
 
Can you please post some before and after performance times?

-George

"The great thing about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 