
Getting data from CSV file into FoxPro table


csr

Programmer
Jul 20, 2000
I need to put the data which is presently in a comma-delimited (CSV) file into a FoxPro table. What is the most efficient way of accomplishing this? Without further direction I am left to creating the FoxPro file using the CREATE command and then using APPEND FROM to pull the data into the file. Is that about the only way to do this?
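A minimal sketch of that CREATE plus APPEND FROM approach (the table name, field names, and widths below are only placeholders for whatever the CSV actually holds):
Code:
*-- minimal sketch only: table name, field names and widths are placeholders
CREATE TABLE myimport (cust C(30), addr C(40), city C(25))
APPEND FROM mydata.csv TYPE DELIMITED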


Don


 
That is about it.
It can be done in a program:
Open the CSV file
Read the first line
Count the number of commas
Close the CSV file
Create a table with one column, type Character, Len(your guess)
Then do a loop:
FOR x = 1 TO lnCommaCount + 1
    lcField = "Field" + TRANSFORM(x + 1)
    ALTER TABLE tablename ADD COLUMN &lcField C(nYourGuess)
ENDFOR

This will leave you with one spare column beyond the fields in the file.
After the import you can check that last column for any data;
if there is any, the import failed because of extra commas in the file.
Redo the above steps until the last column is blank.
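A rough sketch of those steps, assuming a file name, table name, and C(50) widths that you would adjust to suit:
Code:
*-- rough sketch of the steps above; file name, table name and C(50) widths are guesses
lnHandle = FOPEN("mydata.csv")
lcLine1  = FGETS(lnHandle, 8192)
= FCLOSE(lnHandle)
lnCommaCount = OCCURS(",", lcLine1)

CREATE TABLE import_tmp (Field1 C(50))
FOR x = 1 TO lnCommaCount + 1      && one spare column to catch stray commas
    lcField = "Field" + TRANSFORM(x + 1)
    ALTER TABLE import_tmp ADD COLUMN &lcField C(50)
ENDFOR

APPEND FROM mydata.csv TYPE DELIMITED

*-- if the spare (last) column ever has data, some row had extra commas
SELECT import_tmp
lcLast = FIELD(FCOUNT())
LOCATE FOR !EMPTY(EVALUATE(lcLast))
IF FOUND()
    ? "Extra commas found - recheck the source file"
ENDIF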

Hope this gives you some ideas

David W. Grewe Dave
 

If your CSV file has only comma delimited data in it, the above method works. But if there is other information, like headers, footers, etc, mixed in with the data you need to extract, then it won't work without a LOT of work.

Many times UPS has sent me CSV files with extraneous info in them. I found that the easiest way to scrape those files is to use low level functions to extract only the data while skipping over the rest of the 'junk'.
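Something along these lines, as a rough sketch only (the file names and the "data lines start with a digit" rule are assumptions for the example; a real file would need its own test):
Code:
*-- rough sketch: copy only the real data lines to a clean file, then import that
lnIn  = FOPEN("ups_raw.csv")
lnOut = FCREATE("ups_clean.csv")
DO WHILE !FEOF(lnIn)
    lcLine = FGETS(lnIn, 8192)
    IF ISDIGIT(LEFT(lcLine, 1))      && assumed rule: data rows start with a digit
        = FPUTS(lnOut, lcLine)
    ENDIF
ENDDO
= FCLOSE(lnIn)
= FCLOSE(lnOut)
*-- then APPEND FROM ups_clean.csv TYPE DELIMITED into the target table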


mmerlinn

"Political correctness is the BADGE of a COWARD!"

 
Or SaveAs Excel and IMPORT FROM File.xls TYPE XLS
 
I think the following is a common practice for .CSV (Comma-Separated Values) files. At least it seems to be how Excel interprets imported .CSV files.

Some fields are delimited/enclosed with "Double-Quotation-Marks" on each end. This happens if any of the following characters need to be used within a field:

1) "Single-Quotation-Mark"
2) "Double-Quotation-Mark"
3) an actual "Comma"

Since we've had problems with VFP-7's Wizards importing these .CSV files converted by Excel, I've written a VFP program that uses Low-level VFP Functions.

1) Prompt the user for Header lines and which line to start the data import from. If it has a Header-Row, you can get the exact number of fields each data row should have using:
lnField_Count = OCCURS(lcField_Delim, lcField_List) + 1

2) I get the maximum field widths for the first "N" rows, or check widths for all data rows.

3) Loop the data rows using:
lcText = ALLTRIM(FGETS(lnINPUTHandle, lnBytes))
which reads bytes until CHR(13), the end-of-line CR character, is encountered.

Here is the parsing logic for each extracted line of data:
If a Double-Quotation-Mark is encountered before the user-selected Field-Delimiter-Mark, then...

Skip any subsequent delimiters until an even number of Double-Quotation-Marks has been passed over. Thereafter, the NEXT delimiter is the actual end of that field.

Unless another Double-Quotation-Mark is found first; in that case, keep skipping subsequent delimiters. (A rough sketch of this logic follows step 5 below.)

4) Next I load a Grid control to present a possible VFP .DBF file structure, for the user to alter as needed for Field Widths and Data Types.

5) Then I load the Grid on the Preview-Tab with a sample of the data.
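Since the parsing step is the tricky part, here is a rough, self-contained sketch of that quote-counting idea. The function and variable names are mine, not from the actual WIP program, and it simply strips the quotation marks as it goes:
Code:
*-- rough sketch of the quote-aware split from step 3; names are made up for the example
*-- usage:  DIMENSION laFields[1]
*--         lnFields = SplitCsvLine(lcText, ",", @laFields)
FUNCTION SplitCsvLine
    LPARAMETERS tcLine, tcDelim, taFields
    LOCAL lnCount, lnPos, lnQuotes, lcField, lcChar
    lnCount  = 0
    lnQuotes = 0
    lcField  = ""
    FOR lnPos = 1 TO LEN(tcLine)
        lcChar = SUBSTR(tcLine, lnPos, 1)
        IF lcChar == ["]
            lnQuotes = lnQuotes + 1
        ENDIF
        IF lcChar == tcDelim AND MOD(lnQuotes, 2) = 0
            && an even number of quotes has passed, so this delimiter really ends the field
            lnCount = lnCount + 1
            DIMENSION taFields[lnCount]
            taFields[lnCount] = CHRTRAN(lcField, ["], "")
            lcField = ""
        ELSE
            lcField = lcField + lcChar
        ENDIF
    ENDFOR
    && the last field has no trailing delimiter
    lnCount = lnCount + 1
    DIMENSION taFields[lnCount]
    taFields[lnCount] = CHRTRAN(lcField, ["], "")
    RETURN lnCount
ENDFUNC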

This is my WIP, a quick 3-tabbed pageframe on a VFP form. It should be much simpler, faster, and work correctly. It is actually easier than editing the VFP Import Wizard and has a more efficient visual interface, with more modern coding methods than the ancient VFP Import Wizard code.
 
My idea was to create an Import-Wizard-Plus version.

Sorry, this is not a program I distribute. It isn't even completely done yet; it was not a priority, and I got sidetracked. But I know it will work; I've been programming for over twenty years in dBase/VFP languages. If you need help debugging any code, I'll be happy to do that for you. I can try to do it this weekend.


dgrewe's idea will not work if there are commas in fields, such as a comma followed by an Apartment or Suite number, a comma followed by a Title, or a City followed by a State.
 
I have done this in the past; it's a pillock of a job as the whole thing gets easily confused.

Try loading it into Excel and saving it as an XLS file.
VFP can pick up this sort of file directly.

Create a holding database then use
Code:
IMPORT from [filename] TYPE XLS

 
Yea, that was the old way of doing things (importing/opening in Excel). Excel will do its unwanted formatting tricks.

I'll try and finish my program this weekend and post it here.
 
Bertiethedog:

If you open the CSV in Excel, you can just 'save as' a DBF III file. NO NEED to a) create a skeleton DBF, b) save as Excel, c) import type XLS. Make sure the Title Row has legal Field Names.
 
That Excel method works most of the time and it is efficient.

1) It does require manual user intervention.

2) It also fails if there are any commas in the columns/fields to be imported.

I didn't get a chance last weekend to finish my program to resolve these issues. I will do it as soon as I can.
 
Hi justamistere:


you wrote "2) It also fails if there are any commas in the columns/fields to be imported."


I just tried a CSV file of the form

"My Name","6872 MyStreet, Suite 31","MyCity, State"

Excel 2003 imported the above into 3 columns.
Is that what you mean?

 