Search results for query: *

  1. wlynch

    Extracting alpha from messy data

    What would be good is if the client didn't insist on using such crazy data in the first place. Thanks to you all. The code does help.
  2. wlynch

    Extracting alpha from messy data

    I think that is what I am after Olaf. I'll have to play around with it this morning. The entire file has lots of different letters in it and this looks like a winner. Thanks. Will
  3. wlynch

    Extracting alpha from messy data

    Thanks. I'll look and see if I can figure something out. WL
  4. wlynch

    Extracting alpha from messy data

    Is there a way to extract (eliminate) all the alpha data from a string that looks like this? They want the numbers, not the letters. Thanks. 63(J) 64(J) 67(J) 35(J) 97(R) xx63(J) xx96(J) 805(K) T32(J) T6(J) 44(J) xx702(J) 45(J) xx11(J)... (one way to strip the letters is sketched after this list)
  5. wlynch

    Speeding up this process

    Interesting stuff. I am working on the suggestion. To answer one question, I don't care if the inserted record is first or second in order. The index idea sounds like a winner JRB. Mike and Olaf, you two always seem to have an answer. Again, thanks to you all!
  6. wlynch

    Speeding up this process

    Cool Mike! Ok, are the fields:
    Field   Field Name   Type   Width   Dec   Index...
  7. wlynch

    Speeding up this process

    Thanks!! I'll have to study all this. To answer your question DSumm, "Why would you want to have duplicate records with only the 'fullname' field being different?" That is intentional. The DO NOT MAIL record is a QC piece that is pulled off the production line at the end of each pallet when...
  8. wlynch

    Speeding up this process

    There is a faster way to accomplish this but I am not sure where to start. This places an additional line into the primary database, and I know how much filters and insert blank will slow the process down. Any ideas? (One possible change is sketched after this list.)
    ******************
    Set Deleted On
    Select 1
    Use Mydatabase
    Set Safety Off
    Set...
  9. wlynch

    Delimited files

    Thanks Olaf-- you are correct, our files often run to 1MM or more records. The records need to go out to an inkjet machine in single-delimiter format for some odd reason. Actually, any delimiter would work, but no closing quotes can be present. I always check for embedded commas... (the kind of check I mean is sketched after this list)
  10. wlynch

    Delimited files

    Ahh.. since I found a way to have the program that outputs the original dbf file add the header as the first record, it turned out all I had to worry about was producing a comma delimited file, hence I ended up not needing that small snippet you suggested. Thanks Olaf!
  11. wlynch

    Delimited files

    Sure, no problem. Here is what I used:
    CLOSE ALL
    CLEAR ALL
    lcFieldString = ''
    lcMemo = ''
    USE GETFILE('dbf', 'Select DBF')     && Prompts for table to be used.
    lnFieldCount = AFIELDS(laGetFields)  && Builds array of fields from the
                                         && selected table.
    *!* Prompt...
  12. wlynch

    Delimited files

    Got it!! All your suggestions and a bit of fiddling around did the trick. Still, sure would be nice if you could just copy to.... etc, etc. and get what you want. LOL Thanks to you all.
  13. wlynch

    Delimited files

    The link you gave me worked great for creating a file with commas and no quotes. Unfortunately, it does not include the header record. I appreciate the effort.
  14. wlynch

    Delimited files

    I've read as many posts as I can find to answer the question of how to produce a comma-ONLY delimited file with NO quotes, but I can't find how to include the header at the same time.
    COPY TO c:\myfolder\mytextfile.txt DELIMITED WITH "" WITH CHARACTER ","
    This will work but gives me a file with... (one way to add the header is sketched after this list)
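
For the "Extracting alpha from messy data" thread, CHRTRAN() is the usual one-call way to drop the letters and keep the numbers. Below is a minimal sketch of that approach against the sample data quoted in the question; the thread does not show the exact code that was exchanged, and the variable names are illustrative.

  * A minimal sketch, assuming the strings look like the quoted sample.
  LOCAL lcRaw, lcDigits
  lcRaw = "63(J) 64(J) 67(J) 35(J) 97(R) xx63(J) xx96(J) 805(K) T32(J) T6(J)"

  * CHRTRAN() removes every character listed in the second argument when the
  * third argument is empty, leaving only the digits and the spaces between them.
  lcDigits = CHRTRAN(lcRaw, "ABCDEFGHIJKLMNOPQRSTUVWXYZ" + ;
                            "abcdefghijklmnopqrstuvwxyz()", "")
  ? ALLTRIM(lcDigits)    && 63 64 67 35 97 63 96 805 32 6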
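
For the "Speeding up this process" thread, the replies point toward dropping SET FILTER and APPEND BLANK. The sketch below shows that kind of change under assumed names: the table (mailfile), the index tag (seq), and the fields (fullname, seq) are illustrative, not the ones from the thread.

  * A minimal sketch: SEEK on an existing index instead of SET FILTER, and a
  * single INSERT INTO instead of APPEND BLANK followed by REPLACE.
  SET DELETED ON
  SET SAFETY OFF
  USE mailfile IN 0
  SELECT mailfile
  SET ORDER TO TAG seq            && assumes an index tag on seq already exists

  IF SEEK(999)                    && position directly on the pallet-end record
      * One statement adds the extra QC record; no blank record to fill in afterwards.
      INSERT INTO mailfile (fullname, seq) ;
          VALUES ("** DO NOT MAIL **", mailfile.seq)
  ENDIF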
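
For the "Delimited files" thread, the poster eventually had the upstream program write the header as the first record of the dbf. When that is not an option, one way to add the header yourself is to build it from AFIELDS() and prepend it to the comma-only, quote-free output quoted in the original question. The sketch below assumes illustrative paths and an illustrative table name (mytable).

  * A minimal sketch: header line from the field names, data rows from COPY TO.
  LOCAL lnFieldCount, lnI, lcHeader
  LOCAL laGetFields[1]
  SET SAFETY OFF
  USE mytable IN 0
  SELECT mytable

  * Build the header record from the table structure.
  lnFieldCount = AFIELDS(laGetFields)
  lcHeader = ""
  FOR lnI = 1 TO lnFieldCount
      lcHeader = lcHeader + IIF(lnI > 1, ",", "") + laGetFields[lnI, 1]
  ENDFOR

  * Comma separators, no quote delimiters, as in the command quoted above.
  COPY TO c:\temp\data_only.txt DELIMITED WITH "" WITH CHARACTER ","

  * Prepend the header and write the final file.
  STRTOFILE(lcHeader + CHR(13) + CHR(10) + FILETOSTR("c:\temp\data_only.txt"), ;
            "c:\temp\mytextfile.txt")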
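
The same thread also mentions checking for embedded commas before trusting a comma-only separator. A minimal sketch of that kind of pre-check, again with an illustrative table and field name (mytable, fullname):

  * Flag any record whose character field already contains a comma.
  USE mytable IN 0
  SELECT mytable
  SCAN
      IF "," $ fullname
          ? "Embedded comma in record", RECNO(), ":", fullname
      ENDIF
  ENDSCAN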
