
Importing currency data containing a "$" and commas


montypython1 (Technical User)
Jan 12, 2005
Greetings and Happy New Year,

What is the best way to import currency data containing "$" and commas?

In the past, I would just build a cursor and then append the records into it without any problem. This time, the data contains "$" and commas, which means I have to import the data as "character" rather than "numeric", and then strip out the unwanted characters afterwards. There has to be a more elegant solution. I tried setting the appropriate fields to "currency" before importing the data, but every value containing a comma was truncated at the comma (ex: $29,999.99 became 29.0000).

Any suggestions?

Thanks,
Dave Higgins
 
Hi Dave,

You didn't mention which format you are importing from, or how you are importing it.

Also, when you say you set the "appropriate" field to currency, did you mean the field you are importing from, or the field you are importing into? I would have thought that, if it was the destination field, that would do the trick. The fact that it is truncating at the comma suggests that you might have SET POINT set to a comma rather than a dot.

Another possibility might be to import the values as character strings (which is what you are currently doing), and then programmatically convert them to numeric or currency afterwards. You would need to add a field to the destination, and then do something like this:

Code:
REPLACE ALL NumField WITH ;
  VAL(CHRTRAN(CharField, "$,", ""))

Mike

__________________________________
Mike Lewis (Edinburgh, Scotland)

Visual FoxPro tips, advice, training, consultancy
Custom software for your business
 
Dave,

Sorry, but when I said you should make sure SET POINT is not set to a comma, I was talking rubbish (not for the first time).

VFP is seeing the comma as the termination of the numeric portion of the value. SET POINT won't make any difference to that.
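
You can see the same behaviour with VAL(), for example:

Code:
? VAL("29,999.99")  && prints 29.00 - the scan stops at the first comma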

If you could strip out the commas before the import, that would fix it, but presumably you can't do that.

My other suggestion is more likely to be what you need.

Mike

 
First, is this a one-time import or will this need to be done repeatedly over time?

If it's a one-time event, then merely put the data into a text file format and convert the necessary data prior to importing it into VFP.

If it's to be a repeated event, then you should most likely write a routine that recognizes this situation and handles it accordingly.

You could always read each imported 'field' into its own character field in a VFP cursor/table and then convert it as Mike suggests above.

Should you have commas within the currency value as you say above, you will need to first write out the file using some other field delimiter (perhaps TAB or Pipe '|') so that the commas within the data will not be confused with field delimiting commas. Then eliminate the commas within the currency field values.

Good Luck,
JRB-Bldr

 
Hi Mike and JRB,

Thanks for your quick response and your suggestions.

Mike, sorry ... I forgot to mention that the data will be coming to me in ".CSV" format. There are approximately 50 fields that contain the $ and comma characters, and the data needs to be imported on a daily basis (which is why I would like to automate this process).

JRB ... you were correct about accidentally deleting the comma delimiters, which is exactly what I did when I first tried a mass replace on the file before importing (using FILETOSTR, STRTRAN, and STRTOFILE).

Since I have so many fields to convert, is there a way to programmatically convert a CSV file to a TAB-delimited file (before importing into VFP)?

Thanks,
Dave
 
Are you saying that you have a CSV file and that the data contains commas as well? Unless the commas are inside quote-delimited fields, that's going to be tricky.

Can you show us a little of the data? A few lines with a few fields per line.

Tamar
 
Since you say that the data file will be ".CSV" format, are the data fields separated with a Quote-Comma or just a Comma?

Unfortunately a CSV file can be sent either way.

If Quote-Comma, then you can use
MyString = STRTRAN(MyString, '","', CHR(9))
to convert the interior field delimiters to TABs.

Then you would need to remove the remaining quote characters on the first and last fields with
MyString = STRTRAN(MyString, '"', '')
so that those fields aren't left with unbalanced quotes.
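
Putting those two together, a minimal whole-file sketch might look like this (the file names are just placeholders, and it assumes the file fits comfortably in memory and that commas only ever appear inside the quoted fields):

Code:
lcText = FILETOSTR("mydata.csv")           && hypothetical source file
lcText = STRTRAN(lcText, '","', CHR(9))    && interior "," delimiters become TABs
lcText = STRTRAN(lcText, '"', '')          && strip the remaining quotes
STRTOFILE(lcText, "mydata.txt")            && write out the TAB-delimited copy

You could then APPEND FROM the new file with TYPE DELIMITED WITH TAB.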

-- * --

Alternatively you could use
FOPEN(), FGETS(), STRTRAN(), FCLOSE()
to parse out each line separately and make the conversions on-the-fly as needed prior to import into the VFP data table.
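
Something along these lines would do it (the file names are just examples):

Code:
lnIn  = FOPEN("mydata.csv")        && source file, opened read-only
lnOut = FCREATE("mydata.txt")      && converted output file
DO WHILE NOT FEOF(lnIn)
   lcLine = FGETS(lnIn, 8192)                && read one line
   lcLine = STRTRAN(lcLine, '","', CHR(9))   && interior delimiters -> TAB
   lcLine = STRTRAN(lcLine, '"', '')         && drop the remaining quotes
   FPUTS(lnOut, lcLine)
ENDDO
FCLOSE(lnIn)
FCLOSE(lnOut)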

Actually this might be the preferred approach, since it would allow you to 'test' the file, based on the contents of the first line or two, to see whether it includes the '$' character and therefore needs this extra processing.

If the '$' was absent, then you could by-pass these extra steps.

Good Luck,
JRB-Bldr


 
Hi Tamar and JRB,

Excellent suggestion about showing some sample data, as it caused a lightbulb to turn on inside my head. Originally, when I looked at the data, it opened in MS Excel. I must have closed and re-saved the file (which changed the original format). As it turns out, the raw data (before I accidentally re-saved it) is actually a hybrid file format I've never seen before: fixed-length combined with comma-delimited, with every field surrounded by quotes (even the numeric values).

Here is a small sample of the data (font may not show as fixed width, but it is):
Code:
","Sale Price     ","Down Payment   ","Trade Allowance","Trade ACV      ","
","$8,691.49      ","$800.00        ","$2,000.00      ","$2,000.00      ","
","$23,631.95     ","               ","$2,826.88      ","$3,500.00      ","
","$11,987.00     ","$500.00        ","$3,059.42      ","$1,500.00      ","
","$20,500.00     ","$5,000.00      ","$1,500.00      ","$1,200.00      ","
","$16,987.00     ","$3,000.00      ","               ","               ","
","$20,999.00     ","$2,000.00      ","$1,500.00      ","$1,500.00      ","
Does this data in the original file format help my cause (of stripping out the $ and commas within each field)?

Thanks,
Dave


 
So long as the record and field lengths are absolutely identical from file to file, import as an SDF, and then convert as per Mike's original suggestion.
Use filler fields for the quotes and commas surrounding each field.

e.g.
Code:
CREATE CURSOR temp ( f1 C(1), saleprice C(15), f2 C(3), downpayment C(15), ;
  f3 C(3), tradeallow C(15), f4 C(3), tradeacv C(15), fn C(1) )

APPEND FROM myfile.txt TYPE SDF

CREATE CURSOR final ( saleprice Y, downpayment Y, tradeallow Y, tradeacv Y )

INSERT INTO final ;
  SELECT VAL(CHRTRAN(saleprice, "$,", "")), ;
         VAL(CHRTRAN(downpayment, "$,", "")), ;
         VAL(CHRTRAN(tradeallow, "$,", "")), ;
         VAL(CHRTRAN(tradeacv, "$,", "")) ;
    FROM temp
 
You'll want to delete the first record in TEMP after the APPEND command, as it will contain the column names, which you don't want to convert.
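
Something like this would do it:

Code:
SELECT temp
GO TOP
DELETE            && flag the header record
SET DELETED ON    && so the later INSERT ... SELECT ignores the flagged record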
 
Dave,

Your sample looks slightly odd. It looks like the double-quotes are surrounding the commas rather than the data. In other words, you have this:

Code:
","$8,691.49      ","$800.00        ","$2,000.00      ","$2,000.00      ","

But I think it should be this:

Code:
"$8,691.49      ","$800.00        ","$2,000.00      ","$2,000.00      "

However, be that as it may, I'm inclined to think you should go with my original suggestion, but also to create an intermediate cursor.

So, first import the file as it is into a cursor, with each field treated as a character field. Then do a pass of that cursor, inserting each record in turn into the final table. At that point, convert the character fields to numeric using the VAL(CHRTRAN(CharField, "$,", "")) construct I mentioned earlier.

Mike


 
Hi Mike and Brigmar,

Thank you for the suggestions.

Mike, you are correct about the data sample ... I should have excluded the first 2 characters (",) and the last 2 characters (,") in my data sample.

I will be working on your suggestions today and hope to have good news to report.

Thanks again.
Dave
 
"However, be that as it may, I'm inclined to think you should go with my original suggestion, but also to create an intermediate cursor."

Mike, in the original data sample I could easily see the value of using an intermediate cursor. Could you enlighten me about the value of using one with the corrected data sample? I just don't see the reason for it.

mmerlinn


"We've found by experience that people who are careless and sloppy writers are usually also careless and sloppy at thinking and coding. Answering questions for careless and sloppy thinkers is not rewarding." - Eric Steven Raymond
 
Mmerlinn,

My suggestion is to do the job in two phases. First, import the source file into an intermediate cursor (or table). Then, do a pass of that cursor, converting each record in turn and storing the results in the final target table.

The reason I suggested an intermediate cursor was that the first phase would result in character fields. The cursor would hold these intermediate values, pending their conversion to the final data types.

Does that answer your question? If not, let me know and I'll try to clarify.

Mike


 
Hi Mike,

How do you move the data from the intermediate cursor into the final cursor, while simultaneously converting the data from character to numeric?

I like your idea of setting up 2 cursors, since the first cursor allows me to verify that all the data is being imported, while the second cursor can be used to handle all of the converted data. Here is what I've done so far:

- Created "Cursor1" with all fields as Character
- Imported data into "Cursor1" (note: all data appears as expected, along with $ and commas)
- Created "Cursor2" with all appropriate fields as Numeric
- Then I attempted to replace data from "Cursor1" into "Cursor2" using the following code:

Code:
SELECT cursor2
REPLACE ALL price_num1 WITH VAL(CHRTRAN(cursor1.price_char1, "$,", ""))
REPLACE ALL price_num2 WITH VAL(CHRTRAN(DBF(cursor1.price_char2), "$,", ""))

My code (above) is not correct, since it resulted in blank fields.
In the past, I've successfully converted data within the same cursor, but I don't know how to pull and convert data from one cursor into another.

Would you mind clarifying one more time?

Thanks,
Dave
 
Dave,

This is what I had in mind for the second phase (this would be done after you have done the initial import into the character fields):

Code:
SELECT Cursor1  && the one with character fields
SCAN
  SELECT Cursor2  && the target cursor or table
  APPEND BLANK
  REPLACE ;
    price_num1 WITH VAL(CHRTRAN(cursor1.price_char1, "$,", "")), ;
    price_num2 WITH VAL(CHRTRAN(cursor1.price_char2, "$,", ""))
ENDSCAN

There are probably other ways of doing it (for example, using SQL, such as INSERT INTO Cursor2 ... SELECT ... FROM Cursor1), but I think the above code would be the simplest.
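
For example, with the field names from your example, the SQL version might look something like this (it assumes Cursor2's columns are defined in this order, and this form of INSERT needs a reasonably recent version of VFP):

Code:
INSERT INTO Cursor2 ;
   SELECT VAL(CHRTRAN(price_char1, "$,", "")), ;
          VAL(CHRTRAN(price_char2, "$,", "")) ;
     FROM Cursor1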

By the way, although we're referring to Cursor1 and Cursor2, they could of course just as well be physical tables.

Mike


 
Hi Mike,

Hey, that is pretty cool ... and it worked like a charm! Thank you.

You have streamlined my lengthy code into a few elegant lines of code, which is exactly what I was looking for. My previous attempt got the job done, but involved multiple unnecessary steps: (1) building the same "cursor1" for import of data; (2) then using a SELECT statement to create "cursor2" in order to add additional numeric fields; (3) replacing these new numeric fields with data; (4) deleting the original character fields.

Your process is better, and I learned something at the same time.
Thanks again.

Dave Higgins
 