Converting a text CSV string to an MS Access table

tedsmith (Programmer)
I notice there are many examples around of using VB6 to fill an MS Access table with data from a FILE containing CSV data.

My problem is that I have the CSV data in a single string, having received it from an internet site using WinHTTP.
I want to use this text to create a table so I can use SQL to run various queries.

I could do this in two kludgy ways: by laboriously separating each field and writing it to a table using recordsets, or by saving the text to a file and immediately using the file-import method to create the table, as sketched below.
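To be concrete, the save-to-file kludge would be something along these lines. It is only a rough, untested sketch: the temp path, the Jet Text ISAM options (HDR, FMT) and the assumption that the driver copes with the string's line endings are all guesses.

Code:
Sub MakeGPSTableViaFile(MyGPSData As String)
'Rough sketch only: dump the CSV string to a temp file, then let Jet's
'Text ISAM append the whole file with one INSERT ... SELECT statement.
'Temp path and Text ISAM options are assumptions.
Dim FileNum As Integer
Dim Mydb As DAO.Database

FileNum = FreeFile
Open "C:\Temp\GPSData.csv" For Output As #FileNum
Print #FileNum, MyGPSData
Close #FileNum

Set Mydb = OpenDatabase(MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb")
Mydb.Execute "DELETE FROM GPSData" 'clear old table entries
Mydb.Execute "INSERT INTO GPSData SELECT * FROM " & _
             "[Text;FMT=Delimited;HDR=No;Database=C:\Temp].[GPSData#csv]"
Mydb.Close
End Sub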

Is there a way of using a method similar to the CSV-file-to-table one, but going straight from the CSV string to the table instead?

I haven't found one yet but maybe I am looking in the wrong place?
 
After some experimentation in areas I am familiar with, I came up with two minimal-code solutions that will probably be fast enough and, with extra error trapping, cover possible intermittent errors in the original incoming CSV string data.

The first is similar to dilettante's example but faster, using INSERT INTO to write each row.
Both split the string into an array of rows, then wrap each field in single quotes to write the row.
With a string of 92,629 characters (5,700 rows of 8 columns) the first method takes 2.2 seconds and the second 1.1 seconds, which I can probably live with.

Code:
Sub MakeGPSTable(MyGPSData As String)
'make table from CSV string using ADODB
Dim MyConn As ADODB.Connection
Set MyConn = New ADODB.Connection
Dim MyRecSet As New ADODB.Recordset
Dim RowArray() As String
Dim RowCounter As Integer
Dim FieldString As String
Dim T As Single
T = Timer
On Error GoTo MGTXError
MyConn.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source= " & MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb"
MyConn.Open
MyConn.Execute ("DELETE FROM GPSData")
RowArray = Split(MyGPSData, vbCr)
For RowCounter = 1 To UBound(RowArray) - 1
    FieldString = Chr(39) & Replace(RowArray(RowCounter), Chr(44), (Chr(39) & Chr(44) & Chr(39))) & Chr(39)
    Set MyRecSet = MyConn.Execute("INSERT INTO GPSData VALUES(" & FieldString & ")")
Next
MyConn.Close
Debug.Print Timer - T '(2.2 secs)
On Error GoTo 0
Exit Sub

MGTXError:
MsgBox Err.Description
Resume Next

End Sub

The second uses DAO, which is twice as fast.

Code:
Sub MakeGPSTableX(MyGPSData As String)
'Makes a table from GPS String data using DAO
Dim RowCounter As Integer
Dim Mydb As Database
Dim FieldString As String
Dim RowArray() As String
Dim T As Single
T = Timer
On Error GoTo MGTError
Set Mydb = OpenDatabase(MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb")
Mydb.Execute "DELETE GPSData" 'clear old table entries
RowArray = Split(MyGPSData, vbCr)
For RowCounter = 1 To UBound(RowArray)-1
    FieldString = Chr(39) & Replace(RowArray(RowCounter), Chr(44), (Chr(39) & Chr(44) & Chr(39))) & Chr(39)
    Mydb.Execute ("INSERT INTO GPSData VALUES(" & FieldString & ")")
Next
Mydb.Close
Debug.Print Timer - T '(1.1secs)
On Error GoTo 0
Exit Sub

MGTError:
MsgBox Err.Description
Resume Next

End Sub

If there were an easy way to write all the rows in one SQL statement it would probably be faster still (updating one table from another similar table takes 0.2 seconds, for example). A challenge for someone?
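The closest thing I can think of in Jet SQL is chaining the rows together with UNION ALL into an INSERT INTO ... SELECT. Below is a rough, untested sketch: it assumes a helper table called OneRow that holds exactly one record, and it flushes in chunks so each statement stays well under Jet's statement-length limit. I have no idea yet whether Jet would chew through it any faster than the row-by-row INSERTs.

Code:
Sub MakeGPSTableUnion(MyGPSData As String)
'Untested sketch: build INSERT INTO ... SELECT ... UNION ALL statements
'in chunks. Assumes a helper table OneRow containing exactly one record.
Dim Mydb As DAO.Database
Dim RowArray() As String
Dim Sql As String
Dim R As Long

Set Mydb = OpenDatabase(MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb")
Mydb.Execute "DELETE FROM GPSData"
RowArray = Split(MyGPSData, vbCr)
For R = 1 To UBound(RowArray) - 1
    If Len(Sql) = 0 Then
        Sql = "INSERT INTO GPSData SELECT "
    Else
        Sql = Sql & " UNION ALL SELECT "
    End If
    Sql = Sql & "'" & Replace(RowArray(R), ",", "','") & "' FROM OneRow"
    If R Mod 200 = 0 Or R = UBound(RowArray) - 1 Then
        Mydb.Execute Sql 'flush a chunk of up to 200 rows
        Sql = vbNullString
    End If
Next
Mydb.Close
End Sub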

Incidentally, this GPS data is the current locations of up to 1,000 buses running around our city, which I will process to feed (currently) up to 50 bus stations around the city with LCD screens showing real-time departure info. Quite an interesting project. VB6 is still alive!
You can imagine that every millisecond counts, as they all have to be updated with any changes every 20 seconds.

Once I make the table it is quick and easy to sort out from the data which bus is shown at which stop. Each query only takes a few milliseconds; I run a grouped query for each bus stop every 200 ms and send the results to the stops in turn.
 
Just to compare times, could you try the following alternative?
It's untested, so it may give errors.


Code:
Sub MakeGPSTable(MyGPSData As String)
'make table from CSV string using ADODB
Dim MyConn As ADODB.Connection
Set MyConn = New ADODB.Connection
Dim MyRecSet As New ADODB.Recordset
Dim RowArray() As String
Dim RowCounter As Integer
Dim FieldString As String
Dim T As Single

Dim RowDataArray As Variant
Dim RowfieldArray() As Variant
RowfieldArray() = Array("field1", "field2", "field3", "field4") 'change this to match your field names and number of fields
T = Timer
On Error GoTo MGTXError
MyConn.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source= " & MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb"
MyConn.Open
MyConn.Execute ("DELETE FROM GPSData")
MyRecSet.Open "GPSData", MyConn,,adLockOptimistic
RowArray = Split(MyGPSData, vbCr)
For RowCounter = 1 To UBound(RowArray) - 1
    RowDataArray = Split(RowArray(RowCounter), ",")
    MyRecSet.AddNew RowfieldArray, RowDataArray
Next
MyRecSet.UpdateBatch
MyConn.Close
Debug.Print Timer - T
On Error GoTo 0
Exit Sub

MGTXError:
MsgBox Err.Description
Resume Next

End Sub

Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
I have no idea what code you actually tested but this shows an elapsed time for each run varying from 0.21 to 0.25 seconds here. And this isn't a particularly fast PC by current standards either.

Code:
Option Explicit

Private Const CONNSTRING As String = _
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source='$MDB$'"

Private DbPath As String
Private CN As ADODB.Connection
Private RS As ADODB.Recordset
Private FieldNames As Variant

Private Sub OpenDb()
    If Len(Dir$(DbPath, vbNormal)) > 0 Then
        Set CN = New ADODB.Connection
        CN.Open Replace$(CONNSTRING, "$MDB$", DbPath)
    Else
        With CreateObject("ADOX.Catalog")
            .Create Replace$(CONNSTRING, "$MDB$", DbPath)
            Set CN = .ActiveConnection
        End With
        CN.Execute "CREATE TABLE SOMETABLE(" _
                 & "F1 TEXT(25)," _
                 & "F2 TEXT(25)," _
                 & "F3 TEXT(25)," _
                 & "F4 TEXT(25)," _
                 & "F5 TEXT(25)," _
                 & "F6 TEXT(25)," _
                 & "F7 TEXT(25)," _
                 & "F8 TEXT(25))", _
                   , _
                   adCmdText Or adExecuteNoRecords
    End If
    'Hard-code or could fetch looping over ADOX Columns of Table:
    FieldNames = Array("F1", "F2", "F3", "F4", "F5", "F6", "F7", "F8")
    Set RS = New ADODB.Recordset
    With RS
        .CursorLocation = adUseServer
        .Open "SOMETABLE", CN, adOpenForwardOnly, adLockOptimistic, adCmdTable
    End With
End Sub

Private Sub CloseDb()
    RS.Close
    CN.Close
End Sub

Private Function VSplit(ByVal Text As String, ByVal Delim As String) As Variant
    'Like Split, but returns a Variant containing an array of Variants.
    Dim Pos As Long
    Dim NextPos As Long
    Dim I As Long
    Dim ReturnValue As Variant

    Pos = 1
    Do
        Pos = InStr(Pos, Text, Delim)
        If Pos = 0 Then Exit Do
        I = I + 1
        Pos = Pos + Len(Delim)
    Loop
    ReDim ReturnValue(I)
    Pos = 1
    I = 0
    Do
        NextPos = InStr(Pos, Text, Delim)
        If NextPos = 0 Then Exit Do
        ReturnValue(I) = Trim$(Mid$(Text, Pos, NextPos - Pos))
        I = I + 1
        Pos = NextPos + Len(Delim)
    Loop
    ReturnValue(I) = Trim$(Mid$(Text, Pos))
    VSplit = ReturnValue
End Function

Private Sub AddNewRows(ByVal CsvLines As String)
    Dim Lines() As String
    Dim I As Long
    
    Lines = Split(CsvLines, vbCr)
    For I = 0 To UBound(Lines)
        If Len(Lines(I)) > 0 Then
            RS.AddNew FieldNames, VSplit(Lines(I), ",")
        End If
    Next
End Sub

Private Sub Main()
    Dim I As Long
    Dim Line As String
    Dim CsvLines As String
    Dim T0 As Single
    
    DbPath = App.Path & "\demo.mdb"
    OpenDb
    
    Line = "Name567890123456789012345,Junk567890123456789012345," _
         & "Stuff67890123456789012345,Junk5678901234567890123," _
         & "Stuff678901234567890123,Junk567890123456789012," _
         & "Stuff67890123456789012,Crap" & vbCr
    CsvLines = Space$(Len(Line) * 5700)
    For I = 1 To 5700
        Mid$(CsvLines, (I - 1) * Len(Line) + 1) = Line
    Next

    T0 = Timer()
    AddNewRows CsvLines
    MsgBox Format$(Timer() - T0, "#,##0.000")
    CloseDb
End Sub
 
Adding one line:

Code:
Private Sub AddNewRows(ByVal CsvLines As String)
    Dim Lines() As String
    Dim I As Long
    
    CN.Execute "DELETE FROM SOMETABLE", , adCmdText Or adExecuteNoRecords
    Lines = Split(CsvLines, vbCr)
    For I = 0 To UBound(Lines)
        If Len(Lines(I)) > 0 Then
            RS.AddNew FieldNames, VSplit(Lines(I), ",")
        End If
    Next
End Sub

... increases each run by an additional 0.03 seconds or so.
 
Aren't you putting the same data into every row of the table? I suspect this would be much faster.
I would think you have to try a real 90,000-character CSV string where every character is different to test it.
The data I tested with is the real data of an existing system, downloaded from the internet.
You have to make up a CSV string using random characters to really test it.

The batch-processing suggestion, when applied to my DAO version, halves the time again to 0.6 seconds.
I inserted DBEngine.Workspaces(0).BeginTrans before the loop and CommitTrans after it.
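Concretely, it is just the DAO routine from above with the insert loop wrapped in one transaction (error handling trimmed):

Code:
Sub MakeGPSTableTrans(MyGPSData As String)
'DAO version with all the INSERTs committed in a single transaction.
Dim Mydb As DAO.Database
Dim RowArray() As String
Dim RowCounter As Long
Dim FieldString As String

Set Mydb = OpenDatabase(MyDrive & ":\DataCollectionServer\DataCollectionServerSettings.mdb")
Mydb.Execute "DELETE FROM GPSData" 'clear old table entries
RowArray = Split(MyGPSData, vbCr)
DBEngine.Workspaces(0).BeginTrans
For RowCounter = 1 To UBound(RowArray) - 1
    FieldString = Chr(39) & Replace(RowArray(RowCounter), Chr(44), (Chr(39) & Chr(44) & Chr(39))) & Chr(39)
    Mydb.Execute ("INSERT INTO GPSData VALUES(" & FieldString & ")")
Next
DBEngine.Workspaces(0).CommitTrans 'one commit for all rows
Mydb.Close
End Sub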

The general impression I have had is that SQL insertion (INSERT INTO) is faster than using AddNew, and that DAO is faster than ADODB, but is this the case?
 
>DAO which is twice as fast.

Well, yes. DAO is the data access technology of choice for Jet databases.

>If there was an easy way to write all rows in one SQL statement

You can with SQL Server (as fredericofonseca hinted much earlier in the thread)

>You're right, speed might not be a factor.
>the first method takes 2.2 seconds, the second 1.1 seconds which I can probably live with

So, is speed a factor or not?
 
Yes, speed is a factor, but it is relative. I need to know what time it will take so I can arrange other things in the app.
If it is slow I would have to arrange other things differently and split off less important tasks.
My problem is that I have to download the data and update every one of the (now) 72 clients once every 15 seconds.
I am therefore looking for the fastest possible method so I can arrange the timing of other activities and split tasks across different parts of a 15-second cycle, plus a lot of other housekeeping tasks that happen once a minute, 24 hours a day.
E.g. 5 seconds to make the table would be hopeless, 1.2 workable, 0.6 much better and 0.3 excellent!

I have written quite a few apps with cyclic routines in them to receive data, process it and feed it out to machines continuously, and I know every millisecond can count.

Yes, I am looking for a suitable SQL batch update query from text.
I haven't been able to get fredericofonseca's suggestion to work as a whole update.
It would appear that you still have to apply AddNew for every row.
It is therefore likely no faster.

When you mention SQL Server, are you saying this has a SQL statement that will do a batch job that Jet won't do?
 
>When you mention SQL Server, are you saying this has a SQL statement that will do a batch job that Jet won't do?

Yep. For example, SQL Server's INSERT INTO ... VALUES is more flexible than that in Jet-SQL, and can add up to 1000 rows in a single statement rather than 1 (you'd still require a teeny bit of manipulation of the CSV string to get it into the correct format)
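Roughly, the manipulation needed is just turning each CSV row into a parenthesised value list and sending them in groups of no more than 1000. An untested sketch follows; it needs SQL Server 2008 or later, the GPSData table name is borrowed from earlier in the thread, and Cnn is assumed to be an already-open ADO connection to the SQL Server database:

Code:
Sub InsertMultiRow(MyGPSData As String, Cnn As ADODB.Connection)
'Untested sketch, SQL Server only: one INSERT per chunk with a multi-row
'VALUES list (row constructors are capped at 1000 rows per statement).
Dim RowArray() As String
Dim Sql As String
Dim R As Long
Dim RowsInChunk As Long

RowArray = Split(MyGPSData, vbCr)
For R = 0 To UBound(RowArray)
    If Len(RowArray(R)) > 0 Then
        If RowsInChunk = 0 Then
            Sql = "INSERT INTO GPSData VALUES "
        Else
            Sql = Sql & ","
        End If
        Sql = Sql & "('" & Replace(RowArray(R), ",", "','") & "')"
        RowsInChunk = RowsInChunk + 1
    End If
    If RowsInChunk = 1000 Or (R = UBound(RowArray) And RowsInChunk > 0) Then
        Cnn.Execute Sql, , adCmdText Or adExecuteNoRecords 'send this chunk
        RowsInChunk = 0
    End If
Next
End Sub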
 
No, the "speed" of DAO is almost entirely mythical aside from a few contrived cases which don't fit your scenario at all.

No, AddNew in the form I used above can be far faster than cobbling together SQL DML by hand and executing it, especially when using a firehose cursor as I did.

No, there is no need for "random" data for a test such as this. I tried it, same results: well under 0.3 seconds and most trials only 0.25 or less. Far better than your 1.1 second times.

Code:
Private Sub Main()
    Dim InsertAt As Long
    Dim I As Long
    Dim F As Long
    Dim Field As String
    Dim CsvLines As String
    Dim T0 As Single
    
    DbPath = App.Path & "\demo.mdb"
    OpenDb
    
    Randomize
    
    CsvLines = Space$(18& * 8& * 5700&)
    InsertAt = 1
    For I = 1 To 5700
        For F = 1 To 8
            Field = ChrW$(Int(Rnd() * 26) + 65) _
                  & CStr(Int(Rnd * 100000000)) _
                  & CStr(Int(Rnd * 100000000))
            Mid$(CsvLines, InsertAt) = Field
            If F = 8 Then
                Mid$(CsvLines, InsertAt + Len(Field)) = vbCr
            Else
                Mid$(CsvLines, InsertAt + Len(Field)) = ","
            End If
            InsertAt = InsertAt + Len(Field) + 1
        Next
    Next
    CsvLines = Left$(CsvLines, InsertAt - 1)

    T0 = Timer()
    AddNewRows CsvLines
    MsgBox Format$(Timer() - T0, "#,##0.000")
    CloseDb
End Sub

You not only fail to use string-builder logic, but fail to use multicharacter literals where you could, and instead concatenate characters one by one, generated on the fly using the slowest possible methods. Not only do you use the slow Variant-returning Chr() function, it is the slower ANSI flavour of the already slow ChrW()! All of this slow-boat cabbage chewing is frighteningly bad programming.
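For example, the field-quoting line can be done with plain string literals. A hypothetical helper with the same behaviour as the Chr() version, just without the character-by-character work:

Code:
Private Function QuoteRow(ByRef RowText As String) As String
    'Wrap each comma-separated field in single quotes using multicharacter
    'string literals instead of concatenating Chr(39) and Chr(44) calls.
    QuoteRow = "'" & Replace$(RowText, ",", "','") & "'"
End Function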

It becomes harder to take you seriously each day. I can't wait to see how you manage to move the goalposts next.
 
> a few contrived cases

Have to say that when the company I was working for actually carried out some tests a number of years ago, DAO did indeed prove to be faster than ADO (not by a huge margin, admittedly) for simple operations, which this is. I suppose it is possible that they were contrived cases.
 
Makes you wonder why his vaunted DAO test case takes so long then. Maybe it's all of that other bad code? The biggest downside of DAO is that it closes off the option of another DBMS.
 
> The biggest downside of DAO is that it closes off the option of another DBMS.

Indeed.
 
>Yes, I am looking for a suitable SQL batch update query from text.
>I haven't been able to get fredericofonseca's suggestion to work as a whole update.
>It would appear that you still have to apply AddNew for every row.
>It is therefore likely no faster.

Yes, you do the AddNew for each row ... but the main thing is that you only update them to the DB after all the rows have been added to the recordset, not one by one.

Code:
For RowCounter = 1 To UBound(RowArray) - 1
    RowDataArray = Split(RowArray(RowCounter), ",")
    MyRecSet.AddNew RowfieldArray, RowDataArray
Next
MyRecSet.UpdateBatch

Note that the UpdateBatch is after the loop.

As for AddNew being faster than a single insertion - it all depends on volumes.

Small volumes - a direct insert is normally faster.

Significant volumes - a parameterized query is faster than a direct insert, and will be faster than a batch update if volumes are really huge and a single update is done at the end. Note that you can do batch updates every 50k records, for example, which would be faster than loading 1,000,000 records into a recordset and doing one single update at the end.

And as mentioned by others, using DAO locks you into Access - have you tried playing around with SQL Server Express? You can use its LocalDB feature. The only "issue" is the 1 GB max memory it will use, which may still outperform Access. The database size limit is 10 GB per database, not per instance, so at worst that means a limit of 10 GB per table - not really an issue if you are coming from Access.
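For reference, the parameterized route I mean looks roughly like the untested sketch below. The eight 25-character text fields and the GPSData table come from earlier in the thread, and Cnn is assumed to be an already-open ADO connection.

Code:
Sub InsertParameterized(MyGPSData As String, Cnn As ADODB.Connection)
'Untested sketch: prepare one parameterized INSERT, then execute it per row.
Dim Cmd As ADODB.Command
Dim RowArray() As String
Dim Fields() As String
Dim R As Long
Dim F As Long

Set Cmd = New ADODB.Command
Set Cmd.ActiveConnection = Cnn
Cmd.CommandType = adCmdText
Cmd.CommandText = "INSERT INTO GPSData VALUES(?,?,?,?,?,?,?,?)"
Cmd.Prepared = True
For F = 1 To 8
    Cmd.Parameters.Append Cmd.CreateParameter("p" & F, adVarWChar, adParamInput, 25)
Next

RowArray = Split(MyGPSData, vbCr)
For R = 1 To UBound(RowArray) - 1
    Fields = Split(RowArray(R), ",")
    For F = 0 To 7
        Cmd.Parameters(F).Value = Fields(F)
    Next
    Cmd.Execute , , adExecuteNoRecords
Next
End Sub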



Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
Thanks, I really appreciate your interest!

Don't worry, I will experiment again with dilettante's suggestion when I get the rest of the app finished.

The database is really quite a small one in this case and 0.6 sec is not too bad.
 
I tested both wrapping the updates in a transaction and batch updating. There was no advantage in doing either one, as expected.
 
But since I did sleep at a Holiday Inn Express last night [wink], I took another stab at this.

Here is an even faster refinement, taking from 0.14 to 0.16 seconds to insert the 5,700 rows from a String. As a bonus, it is even a little more generalized than the previous code. I don't think it has any bugs left, though some might well lurk in there somewhere.

Code:
Option Explicit

Private Const CONNSTRING As String = _
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source='$MDB$'"

Private DbPath As String
Private CN As ADODB.Connection
Private FieldIds As Variant

Private Sub OpenDb()
    If Len(Dir$(DbPath, vbNormal)) > 0 Then
        Set CN = New ADODB.Connection
        CN.Open Replace$(CONNSTRING, "$MDB$", DbPath)
    Else
        With CreateObject("ADOX.Catalog")
            .Create Replace$(CONNSTRING, "$MDB$", DbPath)
            Set CN = .ActiveConnection
        End With
        CN.Execute "CREATE TABLE SOMETABLE(" _
                 & "F1 TEXT(25)," _
                 & "F2 TEXT(25)," _
                 & "F3 TEXT(25)," _
                 & "F4 TEXT(25)," _
                 & "F5 TEXT(25)," _
                 & "F6 TEXT(25)," _
                 & "F7 TEXT(25)," _
                 & "F8 TEXT(25))", _
                   , _
                   adCmdText Or adExecuteNoRecords
    End If
    CN.CursorLocation = adUseServer
    FieldIds = Array("F1", "F2", "F3", "F4", "F5", "F6", "F7", "F8")
End Sub

Private Sub CloseDb()
    CN.Close
End Sub

Private Sub PutString( _
    ByRef StringData As String, _
    ByVal Connection As ADODB.Connection, _
    ByVal TableName As String, _
    ByVal ColumnIds As Variant, _
    Optional ByVal ColumnDelimiter As String = vbTab, _
    Optional ByVal RowDelimiter As String = vbCr, _
    Optional ByVal NullExpr As Variant = vbNullString)
    'A sort of "inverse analog" of the ADO Recordset's GetString() method.
    
    Dim RS As ADODB.Recordset
    Dim ColumnStart As Long
    Dim ColumnLength As Long
    Dim ColumnValues() As Variant
    Dim Pos As Long
    Dim RowLimit As Long
    Dim I As Long
    Dim AtRowEnd As Boolean

    If (VarType(ColumnIds) And vbArray) = 0 Then Err.Raise 5 'Invalid procedure call or argument.
    
    With New ADODB.Command
        Set .ActiveConnection = Connection
        .CommandType = adCmdTable
        .CommandText = TableName
        .Properties![Append-Only Rowset] = True
        .Properties![Others' Changes Visible] = False 'Doesn't matter when using exclusive access.
        Set RS = .Execute()
    End With
    
    ReDim ColumnValues(UBound(ColumnIds))
    Pos = 1
    Do
        RowLimit = InStr(Pos, StringData, RowDelimiter)
        If RowLimit = 0 Then RowLimit = Len(StringData)
        I = 0
        AtRowEnd = False
        Do
            ColumnStart = Pos
            Pos = InStr(Pos, StringData, ColumnDelimiter)
            If Pos = 0 Or Pos >= RowLimit Then
                ColumnLength = RowLimit - ColumnStart
                If Pos <> 0 Then
                    Pos = Pos + Len(RowDelimiter)
                    If Mid$(StringData, Pos, 1) = vbLf Then Pos = Pos + 1 'Auto-handle CrLf.
                End If
                AtRowEnd = True
            Else
                ColumnLength = Pos - ColumnStart
                Pos = Pos + Len(ColumnDelimiter)
            End If
            ColumnValues(I) = Trim$(Mid$(StringData, ColumnStart, ColumnLength))
            If Not IsMissing(NullExpr) Then
                If ColumnValues(I) = NullExpr Then ColumnValues(I) = Null
            End If
            I = I + 1
        Loop Until AtRowEnd
        RS.AddNew ColumnIds, ColumnValues
    Loop Until Pos = 0
End Sub

Private Sub Main()
    Dim InsertAt As Long
    Dim I As Long
    Dim F As Long
    Dim FieldText As String
    Dim CsvLines As String
    Dim T0 As Single
    
    DbPath = App.Path & "\demo.mdb"
    OpenDb
    CN.Execute "DELETE FROM SOMETABLE", , adCmdText Or adExecuteNoRecords
    
    Randomize
    
    CsvLines = Space$(18& * 8& * 5700&)
    InsertAt = 1
    For I = 1 To 5700
        For F = 1 To 8
            FieldText = ChrW$(Int(Rnd() * 26) + 65) _
                      & CStr(Int(Rnd * 100000000)) _
                      & CStr(Int(Rnd * 100000000))
            Mid$(CsvLines, InsertAt) = FieldText
            If F = 8 Then
                Mid$(CsvLines, InsertAt + Len(FieldText)) = vbCr
            Else
                Mid$(CsvLines, InsertAt + Len(FieldText)) = ","
            End If
            InsertAt = InsertAt + Len(FieldText) + 1
        Next
    Next
    CsvLines = Left$(CsvLines, InsertAt - 1)

    T0 = Timer()
    PutString CsvLines, CN, "SOMETABLE", FieldIds, ","
    MsgBox Format$(Timer() - T0, "#,##0.000")
    CloseDb
End Sub
 
I had a nagging suspicion...

Yep, it was horribly flawed. After the first row it was dropping the first column (shifting column 1 into 0, 2 into 1, and so on) and reusing the last column value from the first row for every subsequent row.

This appears to fix it though, with no loss in performance:

Code:
Private Sub PutString( _
    ByRef StringData As String, _
    ByVal Connection As ADODB.Connection, _
    ByVal TableName As String, _
    ByVal ColumnIds As Variant, _
    Optional ByVal ColumnDelimiter As String = vbTab, _
    Optional ByVal RowDelimiter As String = vbCr, _
    Optional ByVal NullExpr As Variant = vbNullString)
    'A sort of "inverse analog" of the ADO Recordset's GetString() method.
    
    Dim RS As ADODB.Recordset
    Dim ColumnStart As Long
    Dim ColumnLength As Long
    Dim ColumnValues() As Variant
    Dim Pos As Long
    Dim NewPos As Long
    Dim RowLimit As Long
    Dim I As Long
    Dim AtRowEnd As Boolean

    If (VarType(ColumnIds) And vbArray) = 0 Then Err.Raise 5 'Invalid procedure call or argument.
    
    With New ADODB.Command
        Set .ActiveConnection = Connection
        .CommandType = adCmdTable
        .CommandText = TableName
        .Properties![Append-Only Rowset] = True
        .Properties![Others' Changes Visible] = False 'Doesn't matter when using exclusive access.
        Set RS = .Execute()
    End With
    
    ReDim ColumnValues(UBound(ColumnIds))
    Pos = 1
    Do
        RowLimit = InStr(Pos, StringData, RowDelimiter)
        If RowLimit = 0 Then RowLimit = Len(StringData) + 1
        I = 0
        AtRowEnd = False
        Do
            ColumnStart = Pos
            NewPos = InStr(Pos, StringData, ColumnDelimiter)
            If NewPos = 0 Or NewPos > RowLimit Then
                Pos = InStr(Pos, StringData, RowDelimiter)
                ColumnLength = RowLimit - ColumnStart
                If Pos <> 0 Then
                    Pos = Pos + Len(RowDelimiter)
                    If Mid$(StringData, Pos, 1) = vbLf Then Pos = Pos + 1 'Auto-handle CrLf.
                End If
                AtRowEnd = True
            Else
                Pos = NewPos
                ColumnLength = Pos - ColumnStart
                Pos = Pos + Len(ColumnDelimiter)
            End If
            ColumnValues(I) = Trim$(Mid$(StringData, ColumnStart, ColumnLength))
            If Not IsMissing(NullExpr) Then
                If ColumnValues(I) = NullExpr Then ColumnValues(I) = Null
            End If
            I = I + 1
        Loop Until AtRowEnd
        RS.AddNew ColumnIds, ColumnValues
    Loop Until Pos = 0
End Sub
 
Sure enough, another bug.

At the end of PutString the last line should read:

Code:
    Loop Until Pos = 0 Or Pos > Len(StringData)

Without this change it worked when the final line had no trailing row delimiter but failed if there was one. I'd been testing both cases to make sure and somehow missed it on the last pass through.
 
SELECT CASE IDEAL
I would have thought at a Holiday Inn you'd be sitting at the poolside bar surrounded by tall shapely blondes, sipping your martini(s)?
ELSE
"What will you have, Sir?"
"One part adExecuteNoRecords, two parts ColumnDelimiter and InsertAt a sprig of StringData, shaken not stirred"
END IF
 