Tek-Tips is the largest IT community on the Internet today!


Loading Some Fields From a Tab Delimited Text File

Status
Not open for further replies.

FoxNovice (Programmer)
Apr 10, 2008
Hi! I have a very large tab-delimited text file, and in order to stay under the 2 GB limit I would like to selectively load, on the fly, only some of the fields into a VFP table. For example, if the full text record contains 100 data elements (fields), I would like to load only elements 1, 3, 5, 10, 15 and 100. How can I do this without first creating the full .dbf record and then copying out only the fields I'd like to keep?
Thanks!
John
 
If the result stays below the 2 GB limit, I'd opt for what I recommended:

Code:
* set these according to the input file.
lcWantedFieldnumbersOfInputfile="1,3,7,10"
* you may automatically determine this from the number
* of tabs in the first record:
lnNumberOfInputfields = 20

* create fields as needed plus one dummy field.
Create Cursor InputCursor (..., cDummy C(1)) 
Local Array laWantedFieldnumbers[1]
ALINES(laWantedFieldnumbers, lcWantedFieldnumbersOfInputfile, ",")
* ALINES() fills the array with strings; convert them to numbers
* so the ASCAN() below can match the numeric loop counter.
Local lnI
For lnI = 1 To ALEN(laWantedFieldnumbers)
   laWantedFieldnumbers[lnI] = VAL(laWantedFieldnumbers[lnI])
Endfor

Local lnCount, lcFields, lnInputfield 
lcFields = ""
lnInputfield = 1
For lnCount = 1 TO lnNumberOfInputfields
   IF ASCAN(laWantedFieldnumbers,lnCount)>0
     lcFields = lcFields + FIELD(lnInputfield) + ","
     lnInputfield = lnInputfield + 1
   ELSE
     lcFields = lcFields + "cDummy"
   ENDIF
Endfor

APPEND FROM <<Inputfile>> FIELDS &lcFields

You just need to take care that APPEND FROM can correctly convert the columns of the input text file to the appropriate field types. E.g. a date is only converted if it is stored as YYYYMMDD in the text file. If that's not the case, you are better off importing into a character field and converting afterwards. Writing more specialised import code, as baltman did, might save some time then, but it takes much more effort to solve this generally, e.g. detecting field types from the input file.
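For readers outside VFP, the same two ideas can be sketched in Python: keep only selected field numbers from each tab-delimited record, and convert a date column to YYYYMMDD so a direct load into a Date field would work. The field numbers and the MM/DD/YYYY input format here are invented for illustration.

```python
from datetime import datetime

# Hypothetical: keep only these 1-based field numbers of each record,
# analogous to lcWantedFieldnumbersOfInputfile = "1,3,7,10" above.
WANTED = [1, 3, 7, 10]

def select_fields(record, wanted=WANTED):
    # Split one tab-delimited record and keep only the wanted columns.
    parts = record.split("\t")
    return [parts[i - 1] for i in wanted]

def to_yyyymmdd(text, fmt="%m/%d/%Y"):
    # Re-encode a text date as YYYYMMDD, the only layout that a
    # direct load into a Date field converts without extra work.
    return datetime.strptime(text, fmt).strftime("%Y%m%d")

print(select_fields("a\tb\tc\td\te\tf\tg\th\ti\tj"))  # ['a', 'c', 'g', 'j']
print(to_yyyymmdd("03/15/2008"))                      # 20080315
```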

If it's a standardized CSV text file, you could also use an ODBC text/CSV driver, connect to the text file as if it were a remote table, and then use SQL to import from it.

Bye, Olaf.
 
I missed a clause
Code:
APPEND FROM <<Inputfile>> DELIMITED WITH TAB FIELDS &lcFields

Bye, Olaf.
 
Plus, I forgot: the ELSE branch should add "cDummy," instead of "cDummy", and the last "," has to be cut off after the loop ends.

Bye, Olaf.
 
Hello,

I just stumbled on this thread, and I was having the same problem. I was using the native VFP file functions and it worked fine until I hit the 2 GB limit. So, after reading this, I switched to FSO, but I'm having a problem.

I use SUBSTR() to split each line and keep only the parts I want, in order to make the file smaller. I am loading 40 fields, each between 2 and 40 characters wide, out of a 4023-character-wide record. The first 20 or so fields load fine and the last field loads fine, but the 20 or so in the middle load only NULL characters (ASCII 0), even though there are text characters in the text file at those locations.

It does load all the records into my table, just with NULLs in the middle fields.

Using the VFP functions all the data loaded fine.

I am using VFP 7.0.

Thanks for any help,
Bill
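As a language-neutral aside, the position-based extraction Bill describes can be sketched in Python (the field offsets below are invented for illustration). Slicing a string by position cannot by itself introduce NUL bytes, which suggests looking at how the lines are being read rather than how they are split.

```python
# Hypothetical layout: (1-based start, width) pairs, standing in for
# the repeated SUBSTR(record, start, width) calls in VFP.
FIELD_POSITIONS = [(1, 10), (25, 8), (120, 40)]

def extract_fields(record, positions=FIELD_POSITIONS):
    # Slice each field out of a fixed-width record by position.
    return [record[start - 1:start - 1 + width] for start, width in positions]

# A fake 4023-character record: 10 known characters, then filler.
record = "ABCDEFGHIJ" + "X" * 4013
parts = extract_fields(record)
print(parts[0])  # ABCDEFGHIJ
```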
 
If you have a new question, even if similar to another, you should start a new post.

And in that new post you might also explain what 'FSO' is for those of us who don't recognize the acronym.



 