
Memo field to Excel 2


Jerim65

Technical User
Aug 8, 2010
I am developing an app that 'plays' with timelines in tables.

The 3rd party app using these tables has 4 fields - 3 numeric for date and one memo for details on that date.

My app successfully allows importing and exporting of prepared timelines from/to csv or xls files.

A user has noticed that my app only allows approx 200 chrs to be imported/exported due to the requirement of using a chr field for the contents of a Memo field.

Would it be possible to use more than one chr field to hold the contents of a Memo field when importing or exporting?

How would I measure the longest text content in a Memo field in a table, and create a table with enough chr fields to store it?
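
For reference, a minimal sketch of the measuring part, assuming the memo field is called event (the name used in the code later in this thread) and the table is already open:

Code:
* Find the longest memo and how many 254-character char fields it would need
CALCULATE MAX(LEN(event)) TO lnLongest
lnFields = CEILING(lnLongest / 254)
? lnLongest, lnFields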

Thanks in advance

Coldan
 
First, the limit of char fields is 254 characters in FoxPro, and there may be even longer texts.

I wouldn't go with the idea of separating memos into an array of char fields for Excel export. There is a solution to export memos with their full text to Excel via Excel automation; simply use that and the problem is solved.
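
For illustration only (not the linked solution itself), a minimal sketch of what an Excel automation export of memo fields can look like, assuming Excel is installed and using the year, month, day and event (memo) fields mentioned later in the thread; the alias and file path are assumptions:

Code:
* Sketch: push rows into Excel via OLE automation; cell values are not
* limited to 254 characters, so the full memo text goes across.
loExcel = CREATEOBJECT("Excel.Application")
loBook  = loExcel.Workbooks.Add()
loSheet = loBook.Sheets(1)
lnRow = 1
SELECT timeline
SCAN
	loSheet.Cells(lnRow, 1).Value = timeline.year
	loSheet.Cells(lnRow, 2).Value = timeline.month
	loSheet.Cells(lnRow, 3).Value = timeline.day
	loSheet.Cells(lnRow, 4).Value = timeline.event
	lnRow = lnRow + 1
ENDSCAN
loBook.SaveAs("C:\Temp\timeline.xls")
loExcel.Quit()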


And to create CSV files with the full memo contents use this:

Bye, Olaf.

 
Hi Coldan

Another way you can do this for CSV is by using STRTOFILE().

In this example - "quicknotes" is the memo field

Code:
* CSV File Name
myCSVFile = 'ShortList.CSV'
* Header Row
mheader = 'accountno,name,sincedate,quicknotes,email'
STRTOFILE(mheader+CHR(13)+CHR(10),(myCSVFile))
* Run through your table
SCAN
	* Data Row
	mdataline = TRIM(accountno)+','+TRIM(name)+','+DTOC(sincedate)+;
	',"'+ALLTRIM(STRTRAN(quicknotes,CHR(13),''))+'",'+TRIM(email)
	
	STRTOFILE(mdataline+CHR(13)+CHR(10),(myCSVFile),.T.)
ENDSCAN
* CSV File... Done
If you encounter any problems with memos containing double quotes, you can fix them using STRTRAN() as follows:

Code:
STRTRAN(memo_name,'"',"''")
I found that using two single quote marks '' converts to a double quote " automatically within the resulting text... (handy for measurements in inches etc)
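
For illustration, a sketch of how that replacement could slot into the data-row line of the loop above, combining both STRTRAN() calls (not necessarily Karen's exact code):

Code:
	* Data Row - embedded double quotes become '' and CR characters are removed
	mdataline = TRIM(accountno)+','+TRIM(name)+','+DTOC(sincedate)+;
	',"'+ALLTRIM(STRTRAN(STRTRAN(quicknotes,'"',"''"),CHR(13),''))+'",'+TRIM(email)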

Karen
 
Karen's method of creating a CSV has an important advantage over the Microsoft method quoted by Olaf.

With the Microsoft method, all the line endings in the memo get changed to spaces. In Karen's method, they are changed to a line-feed. That's important, because Excel will interpret the line-feed as a line-ending within the cell contents. So the original line breaks will be preserved.

Of course, you could adapt the Microsoft method to do the same.
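
For example (a sketch only, not the exact Microsoft sample), the cell value could be written with CHR(10) kept in place of each line ending instead of a space; loSheet, lnRow and timeline.event here refer to the automation sketch earlier in the thread:

Code:
* Keep in-cell line breaks: Excel shows CHR(10) inside a cell value as a new line
lcMemo = STRTRAN(timeline.event, CHR(13) + CHR(10), CHR(10))	&& CRLF -> LF
lcMemo = STRTRAN(lcMemo, CHR(13), CHR(10))	&& any stray CR -> LF
loSheet.Cells(lnRow, 4).Value = lcMemo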

Mike

__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro tips, advice, training, consultancy
Custom software for your business
 
I have tried both methods.

Both seem to have trouble where both double quotes and commas are used as part of the memo field.

As an example


The party of "Croziers" is founded, under leadership of the bishop of Oslo, Nikolaus Arnesson.

becomes 3 separate columns in the spreadsheet rendition of the csv file


The party of "Croziers" is founded,
under leadership of the bishop of Oslo,
Nikolaus Arnesson.

my code in part is

Code:
 myevent = STRTRAN(event,'"',"''")
 mdataline = STR(year,4,0)+','+STR(month,2,0)+','+STR(day,2,0)+;
    ','+ALLTRIM(STRTRAN(myevent,CHR(13),''))+;
    ','+ALLTRIM(myevent)+;
    ','

I've tried 2 variations for the event field as mentioned by Karen.

Any suggestions on how to deal with this in each of the methods given by Olaf and Karen?

Thanks

Coldan



 
There is a reason MS does not keep the line feeds and replaces them with a space: VFP can't cope with CSV files having a line feed inside a record; for VFP, if you use APPEND FROM file TYPE CSV, you need one line per record. To mask commas, text fields are always enclosed in quotes, so commas inside quotes don't separate columns and you don't have to mask them by changing commas to something else.

If you have your own methods of reading in CSV files, then you'll need to adapt that.

To handle the effect of losing line breaks with the MS method, you can do something like:

Code:
lcMemo = lcMemo + ;
                     ALLTRIM(MLINE(EVALUATE(laGetFields(lnCount, 1)), ;
                                   lnLoop)) + '<<linefeed>>'

Then, later, replace all '<<linefeed>>' strings with a real line feed again.
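
For instance, a sketch of that restore step after an APPEND FROM ... TYPE CSV, assuming the memo field is called event:

Code:
* Turn the placeholder back into real line breaks in the memo field
REPLACE ALL event WITH STRTRAN(event, '<<linefeed>>', CHR(13) + CHR(10))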

An advantage of Karen's code is obviously also that it's shorter, but you need one such code block for every different export. The Microsoft method works universally for any table, just as COPY TO does, even though COPY TO itself does not export memo fields.

Bye, Olaf.
 
Mike,

My first code line above is
Code:
 myevent = STRTRAN(event,'"',"''")

which is the implementation of Karen's method.

According to her explanation, this deals with double quotes such as the inches abbreviation (say 89") and the " in my example above, which is successfully managed. I get - The party of ''Birchlegs'' is founded - in my spreadsheet when opening the exported CSV file, which is fine.

I am still left with having to deal with a comma within the string formed from the memo field, which causes the split into columns shown above with both of the given methods.

Overall, as these conversions will only involve up to, say, 1000 records, I was thinking of replacing any comma within the myevent string with, say, two stars (**) and warning the user.

When such a set is IMPORTED, I can change them back to commas in the DBF table.
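
A sketch of that round trip (the ** marker and the event field name are taken from the posts above):

Code:
* On export: mask any comma in the memo text before building the CSV line
myevent = STRTRAN(myevent, ',', '**')

* On import, after the CSV has been appended back into the DBF table:
REPLACE ALL event WITH STRTRAN(event, '**', ',')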

Is this sound thinking or is there another way?

Olaf's answer appears to be dealing with the issue you raised and not my real problem.

My import/export always concerns the same table structure (N, N, N, Memo), which shows in my code:

Code:
 mdataline = STR(year,4,0)+','+STR(month,2,0)+','+STR(day,2,0)+;
    ','+ALLTRIM(STRTRAN(myevent,CHR(13),''))+;

I understand the point about replacing any line feed as part of the myevent translation.

Thanks

Coldan

 
Coldan,

Again, as I already said: COPY TO encloses all string fields in double quotes, which is why double quotes in a memo are problematic. But commas should not be, because commas within double quotes should not make any application, including Excel, split the value into two fields.
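
For comparison, a minimal sketch of standard CSV escaping (wrap the memo value in double quotes and double any quote characters inside it, so embedded commas no longer split the cell), using the year/month/day/event fields from the code above:

Code:
* Standard CSV escaping for the memo column: quotes doubled, value wrapped in quotes
lcEvent = '"' + STRTRAN(ALLTRIM(STRTRAN(event, CHR(13), '')), '"', '""') + '"'
mdataline = STR(year,4,0)+','+STR(month,2,0)+','+STR(day,2,0)+','+lcEvent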

If Excel does, then how do you import into Excel, and what is your file extension? I think Excel will handle files with a ".csv" extension correctly in this matter. If you name it .txt, Excel may indeed get it wrong.

Bye, Olaf.
 