Hello,
I am trying to load large text files into a dbf. I was using the native Fox file functions and it worked fine until I hit the 2 gig limit. So, after reading this, I switched to using FileSystemObject (FSO), but I'm having a problem.
I use SUBSTR() to split each line and keep only the parts I want, in order to make the records smaller. I am loading 40 fields, each between 2 and 40 characters wide, out of a 4023-character-wide record. The first 20 or so fields load fine and the last field loads fine, but the 20 or so in the middle load only NULL characters (ASCII 0), even though there are text characters in those positions in the text file.
It does load all the records into my table, just with NULLs in the middle fields.
Using the native VFP functions, all the data loaded fine until I hit the file size limit.
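For reference, here is a simplified version of my load loop (the file name, table, field names, and column positions below are just placeholders, not my real layout):

LOCAL loFSO, loFile, lcLine
loFSO = CREATEOBJECT("Scripting.FileSystemObject")
loFile = loFSO.OpenTextFile("c:\data\bigfile.txt", 1)  && 1 = ForReading
DO WHILE NOT loFile.AtEndOfStream
    lcLine = loFile.ReadLine()
    && Keep only the columns I need out of the 4023-character record
    INSERT INTO mytable (fld01, fld20, fld40) ;
        VALUES (SUBSTR(lcLine, 1, 10), ;
                SUBSTR(lcLine, 2000, 20), ;
                SUBSTR(lcLine, 4000, 24))
ENDDO
loFile.Close()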
I am using VFP 7.0.
Thanks for any help,
Bill