
DAO Recordset String Limitation 1

Status
Not open for further replies.

TopJack

Programmer
Mar 10, 2001
153
GB
I'm using Excel 2003 with VBA to extract data from one spreadsheet to another. This has been working great for some time now, using DAO and an SQL query.

The problem I have now is that the spreadsheet I'm copying from contains a cell with a very long string (potentially 32,000 characters, near the Excel cell limit). The resulting DAO recordset from the SQL query seems to ignore characters beyond 1,016 (it clips the original string).

Where can I find out about the limitations of a DAO recordset (e.g. field storage space), and is there a way to increase the available string length in a specific DAO recordset field?

If you need me to post an extract of the code here please ask.

Any help is greatly appreciated.
 




Hi,

How did you determine that the recordset value was truncated?

Skip,

 
Very good question.

This problem has me totally confused. It must be something easy that I'm not seeing. Here is a simplified extract from my code so you can get an idea of what I'm doing:

Code:
Sub my_routine()

    Dim my_file As String, temp_string As String
    Dim my_db As DAO.Database
    Dim my_rs As DAO.Recordset

    my_file = "c:\test.xls"   'path and name of file

    Set my_db = OpenDatabase(my_file, False, True, "Excel 8.0; HDR=Yes;")   'declare database with configuration
    Set my_rs = my_db.OpenRecordset("select * from [Sheet1$]")   'declare recordset and execute SQL statement

    temp_string = my_rs.Fields(0)   'this string length is 1,016 characters long

    ActiveCell.Value = temp_string  'output value to sheet

End Sub

I run this code against a one-record file and it works. When I run it against the real file (about 40K records), it truncates the same string to 256 characters (sorry, I wrongly said the length was truncated to 1,016 characters before).

I don't understand why this truncation is happening. Some kind of memory corruption?! Or is it just me being stupid?
 



That still did not answer the question I asked.

Here's what you may need to do: break the text up into chunks of no more than 256 characters...
Code:
Dim a As Variant, idx As Integer
...
do
  a = Split(my_rs.Fields(0), " ")

  for idx = 0 to ubound(a)
     if len(somecell.Value & a(idx)) > 256 then
        set somecell = somecell.offset(1)   'move down BEFORE writing, so the full cell is kept
        somecell.Value = a(idx) & " "
     else
        somecell.Value = somecell.Value & a(idx) & " "
     end if
  next
  my_rs.movenext
loop until my_rs.eof
Of course, you'll want to add data to the row for other fields that do not change.

Skip,

 
Sorry Skip, I didn't make it clear that "ActiveCell.Value = temp_string" was how I saw that the text was truncated.

I tried your idea of splitting the text into 256-character chunks before sending the output to the sheet, but I was still getting the same problem: the recordset had already truncated the text before I could do anything with it. This sent me off in the direction that maybe the SQL statement was automatically assigning a data type that was wrong for my data. On the first record of my source sheet I forced 32,000 characters into the cell .... and guess what ..... the code ran perfectly after that.

It appears I need to specifically declare my field data types in the SQL statement, because it is incorrectly defining this particular field. I remember reading somewhere (I think it was about pivot tables) that Excel reads the first so many records (say 200) and decides what data type each column is likely to be. I think this is my problem: the first few hundred records in my file are small strings, whereas later in the file the strings get much larger.

The question now is: how do I define my data types in an SQL SELECT statement, or can I manipulate the recordset before it receives the data?

If the SQL needs changing, I'm not sure what syntax to use. Maybe ...........

"SELECT [F1] as CHAR(32000),[F2] as CHAR(10) FROM [Sheet1$]"

Do you know much about SQL, or is this a question for another forum?
 



I cannot duplicate your problem.

I set up an Excel workbook with a sheet containing over 20,000 rows, each row with 24,720 bytes of text.

I got them all using your code, modified:
Code:
Sub testDAO()
    Dim my_file As String, temp_string As String, lRow As Long, iCol As Integer
    Dim my_db As DAO.Database
    Dim my_rs As DAO.Recordset
    Dim fld As DAO.Field
    
    my_file = "d:\My Documents\dbTestLON _String.xls" 'path and name of file
    
    Set my_db = OpenDatabase(my_file, False, True, "Excel 8.0; HDR=Yes;")   'declare database with configuration
    Set my_rs = my_db.OpenRecordset("select * from [Sheet1$]")   'declare recordset and execute SQL statement
    
    lRow = 2
    Do
        iCol = 1
        For Each fld In my_rs.Fields
            Cells(lRow, iCol).Value = fld.Value
            iCol = iCol + 1
        Next
        lRow = lRow + 1
        my_rs.MoveNext
    Loop Until my_rs.EOF
    my_rs.Close
    my_db.Close
    
    Set my_rs = Nothing
    Set my_db = Nothing
End Sub

Skip,

 
Change all your rows of data to contain only 1 byte of information, except the last row, which should contain your 24,720 bytes of information.

I did the same test with your code above, and not only does it not work, the open workbook crashes with an unexpected error when it reaches the 24,720-byte cell because it cannot read it. There appears to be a mismatch in data types: the recordset can accept the values initially, but if you try to use a value it does not like it.

Thanks for trying this for me - it's driving me mad.
 
Skip, take a look at this MS knowledge base article. It sounds very similar to my issue:

http://support.microsoft.com/kb/194124
 



Interesting!

Excel had a major abort when the long cell came last.

If I copied the long cell to row 4, it ran to completion correctly. My code only inserts data in the output sheet if the length is greater than 256, so I got two rows: one from row 4 and the other from the last row, 20637.

Skip,

 
In reading further, it appears I have found one of the limitations of the Excel ISAM (indexed sequential access method) driver used with this DAO (Data Access Objects) method.

The "OpenDatabase" command, with its connection string, determines what column data types are to be used. Unfortunately for me, by default it will only look at the first 8 rows of column data to decide on the field types - and in my case I can't guarantee that the first 8 rows are a good representation of my data. The connection string can include "IMEX=1", which allows mixed data types, but this can be overridden if, for example, the first 8 rows are numeric: the data type will then be defined as numeric regardless, and any text thereafter will be ignored.
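As a sketch of what the IMEX option looks like in practice (reusing the placeholder path and sheet name from my earlier example - and remembering that, as above, IMEX=1 alone is not guaranteed to fix it):

```vba
' Sketch: same DAO open as before, with IMEX=1 added to the connection
' string so columns holding mixed data are treated as text.
' "c:\test.xls" and [Sheet1$] are placeholders from the earlier example.
Dim my_db As DAO.Database
Dim my_rs As DAO.Recordset

Set my_db = OpenDatabase("c:\test.xls", False, True, _
                         "Excel 8.0; HDR=Yes; IMEX=1;")
Set my_rs = my_db.OpenRecordset("select * from [Sheet1$]")
```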

I have two possible workarounds so far :-

1) Change the number of rows Excel checks to work out the data types. This parameter is held in the registry under "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\3.5\Engines\Excel", where you can change the value of "TypeGuessRows" to something other than 8. If the value is set to zero, all the column data will be scanned to determine field types - but this brings a performance overhead.

or

2) Place a dummy record in the first 8 rows that represents my data, so the driver is fooled into choosing a better data type. In my case, a (first) record containing a cell with 256 characters allows the driver to define the correct column data type for all the other rows.
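For workaround 1, the registry value can be set from a command prompt rather than by hand. A sketch (the Jet version in the path - 3.5, as quoted above - may differ between installs, so check the registry first):

```shell
:: Sketch: set TypeGuessRows to 0 so Jet scans every row when guessing
:: column data types. Run from an elevated command prompt; the Jet
:: version segment of the path (3.5 here) is an assumption from the post.
reg add "HKLM\SOFTWARE\Microsoft\Jet\3.5\Engines\Excel" /v TypeGuessRows /t REG_DWORD /d 0 /f
```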

If anyone finds a better way to solve this problem, please, please, let me know.

Thanks for helping me find this Skip.
 
extract data from one spreadsheet to another
Why not use the Copy method of the Range object?

Hope This Helps, PH.
Want to get great answers to your Tek-Tips questions? Have a look at FAQ219-2884 or FAQ181-2886
 



MS Query is quite useful to get data from other sheets or workbooks, especially when you need to join data in a union. Of course, you could copy 'n' paste, but I find that query is usually a better solution.

TopJack, if I were in control of the source workbook, which is often NOT the case, I'd make the first row of data the dummy trigger row, and then I'd program to include that row in the query but exclude it from writing to the sheet.
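Something like this, as a sketch (assuming the dummy trigger row is the first record the query returns):

```vba
' Sketch: include the dummy trigger row in the query so the driver
' types the column correctly, but skip it when writing to the sheet.
' Assumes the dummy row is the first record in the recordset.
Dim lRow As Long
lRow = 2
my_rs.MoveFirst
my_rs.MoveNext                  ' step past the dummy trigger row
Do While Not my_rs.EOF
    Cells(lRow, 1).Value = my_rs.Fields(0).Value
    lRow = lRow + 1
    my_rs.MoveNext
Loop
```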

But your research regarding the registry key is valuable. I would seldom encounter such a situation, but it is good to have an ace up your sleeve.

TopJack ==> * for initiative and persistence. :)

Skip,

 
Thanks, PH, for showing an interest. Unfortunately, I wish it were that straightforward. I know in my original post I said that I was copying one sheet to another, but that was a simplified version of what my project entails, hence the reason for using DAO.

I read a workbook that contains hierarchical information. This information is displayed in a TreeView (user interface), where the user decides what information is sent as output to the target workbook. Using DAO and putting the source information in a recordset makes using the TreeView relatively easy. The source file is very large as an Excel workbook, so using DAO gives better performance than opening and closing big files.

Just on a side matter, this TreeView functionality is awesome. I wish I had found it earlier.

Thanks guys.
 