
Converting Excel files to DBF

Status
Not open for further replies.

ajpa · Programmer · Jan 31, 2002 · 52 · GB
When I convert an Excel file into DBF format Excel truncates many of the fields. I can expand the cells in Excel, and that overcomes the problem to some extent, but the mere fact that the contents of a cell are entirely visible doesn't seem to ensure that they will all be copied.

Is there an easy way round this problem? I know I can get round it via CSV files, but that's long-winded.

 
Expand the cells in Excel, copy to a new Excel file and save. Then convert the new file to DBF.
 
Hi;

Keep in mind that FoxPro has a limit of 254 characters per field and 255 fields per record.

Ed
Please let me know if the suggestion(s) I provide are helpful to you.
Sometimes you're the windshield... Sometimes you're the bug.
 
Gandalf23

There is a workaround for that limit. I have the code at home.
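One common form of that kind of workaround is to split any over-long value across several 254-character fields and rejoin them after the import. A rough sketch in Python (purely illustrative; this is not the actual code referred to above):

```python
def split_for_dbf(value, limit=254):
    """Break a long string into chunks that each fit inside a
    254-character DBF field; always returns at least one chunk."""
    return [value[i:i + limit] for i in range(0, len(value), limit)] or [""]

# A 600-character value becomes three chunks: 254 + 254 + 92
chunks = split_for_dbf("x" * 600)
```

Concatenating the chunk fields back together after the append restores the original value.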
 
You can also use code that is normally used to move data from FoxPro into Excel and just reverse the references. I'm not sure if you're looking for a solution for just one type of spreadsheet or for various ones. One sheet is easy to write a program for. It's the dynamic stuff that gets hairy...

You may want to refer to: thread184-268816

I definitely agree that Excel plays poorly with other programs. I am currently working on a VFP utility that uses either COM or "save as CSV" methods to pull/import data from any Excel spreadsheet, decides if the data column is a string, number, date, and how long the field needs to be and then creates a table with header names that match the 1st row of the Excel file.
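The decide-what-each-column-is step can be sketched roughly like this (Python here as an illustration only; the utility described above is VFP, and the date format tested below is an assumption):

```python
from datetime import datetime

def infer_column_type(values):
    """Guess a DBF-style type code and width for one column of
    strings: 'N' (numeric), 'D' (date), or 'C' (character)."""
    width = max((len(v) for v in values), default=1)

    def is_number(v):
        try:
            float(v)
            return True
        except ValueError:
            return False

    def is_date(v):
        try:
            datetime.strptime(v, "%m/%d/%Y")  # assumed US-style dates
            return True
        except ValueError:
            return False

    nonblank = [v for v in values if v.strip()]
    if nonblank and all(is_number(v) for v in nonblank):
        return ("N", width)
    if nonblank and all(is_date(v) for v in nonblank):
        return ("D", 8)
    return ("C", min(width, 254))  # cap at FoxPro's character-field limit
```

A real importer would also need to handle mixed columns and decimal widths, which is where the "I still don't feel comfortable" caveat comes in.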

COM isn't fast at all, say 30 seconds per thousand records with 100 columns on a PIV 1.7... but I *know* that every value is right.

The CSV method is many times faster. I have a form with a text field that gets populated using GETFILE(); the code automatically opens the file, removes all delimiter characters, makes every column 255 wide and saves it as a temp.csv file (all with Excel.Visible = .F.: much faster that way!). That file is then appended into a *starting point* DBF with however many character columns the data needs, where I then decide what kind of data is in them and format them accordingly... but I still don't feel comfortable that there won't be an import error....
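Stripping the delimiter characters before the CSV save amounts to a simple character removal; sketched here in Python for illustration only (the thread's own version does this with Excel's Replace() via COM):

```python
def clean_cell(text):
    """Remove characters that would break a comma-delimited append:
    double quotes, commas, and apostrophes."""
    for ch in ('"', ",", "'"):
        text = text.replace(ch, "")
    return text

row = ['He said "hi"', "1,234", "O'Brien"]
cleaned = [clean_cell(c) for c in row]
# → ['He said hi', '1234', 'OBrien']
```

Note that this is lossy ("1,234" becomes "1234"), which is part of why a verification pass afterwards is still worthwhile.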

I have also recently figured out how to run a remote view from Excel into VFP... unless I'm missing something, it has to be the most inefficient thing you could possibly try... under 1,000 lines a minute and all character fields are imported as memos.

Let me know if I can make anything more clear...

Brian
 
My suggestion will not work with dynamic Excel arrangements, but works best for me when I know the structure of the incoming spreadsheet.
Save the spreadsheet as an Excel 5.0 spreadsheet and append that file to an existing .dbf with the fields properly defined. I use Excel 5.0 because I am running VFP 5.0 and that is the latest Excel format it will recognize. You must experiment with whatever version you are using.
When it comes to dynamic spreadsheet structure, you need to make sure all cells are defined as character before they are saved as a .dbf. Otherwise, you run into problems like dropping leading zeroes when numeric values land in char() fields, and much more.
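The leading-zero problem is easy to demonstrate: once a zero-padded value is treated as numeric, the padding is gone before it ever reaches the char() field. A minimal illustration in Python (not Excel or VFP, just to show the principle):

```python
part_number = "00425"

# Treated as numeric (Excel's default for anything that looks
# like a number): the leading zeroes are lost.
as_number = str(int(part_number))

# Defined as character before the save: the padding survives.
as_text = part_number

# as_number == "425", as_text == "00425"
```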
 
Living in England, I'm in bed while you guys are busy offering me advice. Thanks for this bundle of replies.

Yes, I can do it the hard way. I have a table called RECEIVER with character fields f1, f2 and so on that I can populate from a CSV file, then resize, retype, rename and so on; and some of that can be automated. It's just that saving an Excel file as a DBF file with a single keystroke is potentially so neat that it's CRAZY for Microsoft to get the details so badly wrong. I assumed that I was just missing something, but I evidently wasn't.

culleoka's idea looks appealing, but when I try it I get a 'Not a table' message.

Isn't there some way this forum can exert pressure on Microsoft to get its act together?

Tony
 
Here's the code for the automated XL -> CSV -> DBF that I've worked up... there are a few variables to which you'll need to assign values... I have them on a form because I never know what the data is going to look like.

Brian
*************

loExcel = CreateObject("Excel.Application") && create the Excel automation instance
on error WAIT WINDOW "file in use..." NOWAIT
on error
wait window "Formatting and exporting data...." nowait
copy file (excelfile) to c:\temp\temp4import.xls && excelfile is a form variable
excelfile = "c:\temp\temp4import.xls"
loExcel.Workbooks.Open(excelfile)
loExcel.DisplayAlerts = .F.
*loExcel.Visible = .T.
XLRange = firstcolumnletter + ALLTRIM(STR(headrow)) + ":" + lastcolumnletter + ALLTRIM(STR(linesofdata)) && form variables
loExcel.ActiveSheet.Range(XLRange).NumberFormat = "0.00000" && this was good enough for me...
loExcel.ActiveSheet.Range(XLRange).Replace(CHR(34), "") && strip double quotes
loExcel.ActiveSheet.Range(XLRange).Replace(",", "") && strip commas
loExcel.ActiveSheet.Range(XLRange).Replace(CHR(39), "") && strip apostrophes
loExcel.ActiveSheet.Range(XLRange).ColumnWidth = 255

loExcel.ActiveSheet.SaveAs("c:\temp\exportscsv.csv", 6) && 6 = xlCSV

loExcel.Workbooks.Close()
loExcel.Quit && shut Excel down before releasing the reference
release loExcel

use importtemplate
append from c:\temp\exportscsv.csv type delim
 
Brian,

My reply to you has got lost. Perhaps it will reappear soon. In the meantime, I see that you explain loExcel in the thread you pointed me to. No need to reply again.

Tony
 
