Tek-Tips is the largest IT community on the Internet today!


Import Record Limit?

Status
Not open for further replies.

gothbabe55

Technical User
Jun 5, 2002
25
AU
Hi all

I am trying to import data from a text file into a table. Everything works just fine when the file contains 64,999 records or fewer, but as soon as it reaches 65,000 records the import falls over. Does anyone know whether there is a limit on the number of records that can be imported, and/or whether there is a way around it without breaking up the import file?

Thanks!
 
While I would guess that there must be some limit, after using VFP7's Import Wizard to import 55,000,000 records from text files into a free table (DBF), I'd say the limit, if there is one at all, is way up there (possibly the ubiquitous 2 GB file-size limit?).

Note that these records typically consist of only 8 to 10 fields of about 10 to 20 characters maximum each.

Also note that the 55-million-record import takes many hours to complete on my client's 1 Gig Pentium workstation.

Good Luck,


JRB-Bldr
VisionQuest Consulting
Business Analyst & CIO Consulting Services
CIOServices@yahoo.com
 
Thanks for the info, jrbbldr. One question for you, though: can you tell me what size the file you imported was? The one I am importing is 111 MB.
 
Is the size difference in your text import files due to the number of columns as well as the number of rows?

Darrell
 
Hi Darrell

It is row related. The records in the file (bar the header) are fixed-length strings, where each field is identified by its position in the string. The problem surfaced when sample import files of only a few records imported just fine, but the live import file, which has well over 65,000 records, fell over with an invalid-subscript error. To cut a long story short, I spent hours going through the live file looking for dodgy data, and established that the problem was the number of records rather than an invalid character, with the magic number appearing to hinge on record (or row) 65,000.
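The posts here are about VFP, but the fixed-width layout described above can be illustrated in any language by slicing each line at known offsets. A minimal Python sketch, where the field names and offsets are invented purely for illustration:

```python
# Hypothetical layout: each field is identified by its position in the
# fixed-length record string, as described in the post above.
FIELDS = {
    "id":   (0, 8),    # (start offset, width)
    "name": (8, 20),
    "qty":  (28, 5),
}

def parse_record(line):
    """Slice one fixed-width line into a dict of trimmed field values."""
    return {name: line[start:start + width].strip()
            for name, (start, width) in FIELDS.items()}

# Build a sample record with padded fields and parse it back.
line = "00000001" + "Widget".ljust(20) + "00042"
record = parse_record(line)
# record == {"id": "00000001", "name": "Widget", "qty": "00042"}
```

Slicing by offset is exactly why a single wrong-length or shifted line corrupts every field after the damage, which is what makes length checks (below in the thread) such a useful first diagnostic.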

Thanks!
 
I don't know what your error(s) really are, but you do not have a problem because of file size; the input file may be corrupt. The limit is 2 GB and 255 columns, and there is no row limit other than file size. If size were the issue, VFP would appear to freeze, and looking in Explorer would show the table at 2 GB.

I've imported text files larger than 2 GB by using low-level file functions with STRTRAN() or SUBSTR() on the text to convey the same information with less data, so that the actual file I'm importing is under 2 GB. If necessary, that means splitting the file into multiple files, which also lets me support more than 255 columns (you just need to remember to carry the primary key into each table).
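For readers who do need to break a file up, here is a rough sketch of splitting on line boundaries so no record is cut in half. It is Python rather than VFP, and the function name, chunk size, and output naming are all my own:

```python
def split_text_file(path, max_bytes, out_prefix="chunk"):
    """Split a text file into pieces of at most roughly max_bytes each,
    always breaking on a line boundary so no record is cut in two."""
    part, size, out = 0, 0, None
    outputs = []
    with open(path, "r") as src:
        for line in src:
            # Start a new chunk before the first line, or when adding
            # this line would push the current chunk over the cap.
            if out is None or size + len(line) > max_bytes:
                if out:
                    out.close()
                part += 1
                name = f"{out_prefix}{part:03d}.txt"
                outputs.append(name)
                out = open(name, "w")
                size = 0
            out.write(line)
            size += len(line)
    if out:
        out.close()
    return outputs
```

Concatenating the chunks in order reproduces the original file byte for byte, so each chunk can be imported separately and the results appended.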

There are several ways to find the problem without spending hours.

For a fixed-width file, you can read the file line by line with FGETS() and use the length of the returned string to identify the problem line. You can use the code/example in faq184-4275 as a base, e.g.

lcString = FGETS(lnHandle)      && read the next line from the open file
IF LEN(lcString) <> 200         && I expect all lines to be 200 long
    STRTOFILE(lcString, 'badline.txt', 1)  && the 1 makes it additive
ENDIF

For a variable-width (or fixed-width) file where a non-ASCII character might be causing the issue, you can FGETS() the file and FPUTS() (or use FILETOSTR() and STRTOFILE()) into a new file with only allowable characters remaining, using the technique set out in faq184-3378. Make sure to replace illegal characters with an allowable one rather than deleting them, so you don't throw off the fixed-width offsets. Note that not all characters are visible on screen, so this can work even if you don't think illegal characters are the problem.
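The FAQ referenced above is VFP; the same character-scrubbing idea can be sketched in Python. The key point, as the post says, is to replace rather than delete offending bytes so the fixed-width offsets survive (the function name and fill character are my own choices):

```python
def clean_fixed_width(text, fill=" "):
    """Replace non-printable / non-ASCII characters with a fill
    character so every line keeps its exact length and the
    fixed-width field offsets are preserved."""
    return "".join(
        ch if ch in ("\r", "\n") or 32 <= ord(ch) <= 126 else fill
        for ch in text
    )

cleaned = clean_fixed_width("AB\x00CD\x1aEF")
# cleaned == "AB CD EF"  (same length; the bad bytes are blanked out)
```

Because line endings are passed through untouched, the cleaned file has the same line count and line lengths as the original, only with the invisible junk blanked.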

If you're still having a problem, please describe your error more exactly.

Brian
 
Hi Brian

Thanks for your reply. However, I am absolutely positive the issue is not corruption, and here is why:

I ran the import procedure with a file of 65,000 records; it crashed. I then deleted one record from the file and ran the import procedure again: no problem.
So I then copied one of the earlier records in the file (one that had not caused an error) and placed it where the deleted record had been. On re-running the import procedure, the crash occurred again. So it is definitely not related to invalid or corrupt data.

Thanks!
 
Are you sure there is no EOF marker in the text file after record number 64,999, causing VFP to think that is the end of the file? To check this, delete rows 64,999 to 65,001 and try the import again.
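One quick way to test the stray-EOF-marker theory without deleting rows is to scan the file in binary mode for Ctrl-Z (0x1A), the legacy DOS end-of-file character. A Python sketch (the function name is my own):

```python
def find_eof_markers(path):
    """Return the byte offsets of any Ctrl-Z (0x1A) characters,
    the legacy DOS end-of-file marker, found in the file."""
    with open(path, "rb") as f:   # binary mode: no newline translation
        data = f.read()
    return [i for i, b in enumerate(data) if b == 0x1A]
```

An empty list rules the theory out; a non-empty one tells you exactly which record to inspect (offset divided by the fixed record length).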



 
Hi Guys

Thank you so much for all your help and input. We have finally found the issue: the person who wrote the import procedure, despite being told that approximately 300,000 records would be imported, used an array to do the import instead of creating a cursor. Since VFP arrays are limited to 65,000 elements, that explains the magic number. Problem all fixed, and thank you so much!
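The fix described above, swapping a fixed-capacity array for a cursor, has a general shape: stream the records instead of buffering them all in memory. A minimal Python sketch of the streaming side, with an invented function name:

```python
def import_records(path, handle_record):
    """Stream records one line at a time. Memory use stays flat no
    matter how many rows the file holds, so the record count is
    bounded only by storage, not by a container's capacity."""
    count = 0
    with open(path, "r") as f:
        for line in f:                     # reads lazily, line by line
            handle_record(line.rstrip("\n"))
            count += 1
    return count
```

With this shape, a 65,000-row file and a 300,000-row file differ only in runtime, not in whether the import succeeds.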
 
