255 fields limit - splitting the raw file

Status: Not open for further replies.

cakret (Technical User), Mar 2, 2000, GB
I am a beginner with FoxPro. My main use of FoxPro is processing data and running models.
My problem is that I have a raw data file with 257 fields in it (delimited with '%'). There are fields in this file that I won't use, but I do need the last one, and FoxPro won't let me read it, or even create a table with more than 255 fields.
What I'd like to do is split the raw file in two: one file with the first 130 fields and a second with the rest. Then I can merge these two tables into one, keeping only the fields I'm interested in.
Can anyone help me with that?
Regards
Cedric

Cedric
Foxpro Beginner
 
Try importing the data into XLS, delete some columns (or split the table), and then import the XLS file(s) into VFP. This is the simplest way, I guess.

Vlad Grynchyshyn
vgryn@softserve.lviv.ua
 
Create two tables, one with the first 130 columns, and the other with the remainder. Split the source file vertically (probably the hardest part). Import the two separately. MODIFY STRUCTURE on the two and remove the columns you don't want. Finally, use an SQL SELECT command like this:

[tt]select * from table1, table2 into dbf MyTable[/tt]

All of the columns from both will be combined. Warning: if, after removing the columns you don't want, you still end up with over 255 columns combined, it will fail.

Robert Bradley
Visual FoxPro Development
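The vertical split described above can also be done outside FoxPro before importing. Here is a minimal Python sketch (the file names and the 130-field cut point are assumptions taken from the question): it writes the first 130 '%'-delimited fields of each line to one file and the remaining fields to a second file.

```python
def vsplit(src_path, p1_path, p2_path, cut, delim="%"):
    """Split a delimited text file vertically: the first `cut` fields
    of each line go to p1_path, the remaining fields to p2_path."""
    with open(src_path) as src, \
         open(p1_path, "w") as p1, \
         open(p2_path, "w") as p2:
        for line in src:
            fields = line.rstrip("\n").split(delim)
            p1.write(delim.join(fields[:cut]) + "\n")
            p2.write(delim.join(fields[cut:]) + "\n")

# e.g. vsplit("raw.txt", "part1.txt", "part2.txt", 130)
```

The two output files can then be appended into two separate tables and joined as shown.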
 
I just encountered this problem myself. This may not be the most efficient way of splitting a comma-delimited file with more than 255 fields into two (or more) files, but it works.

create table filebreak (f1 c(250), f2 c(250), f3 c(250), f4 c(250), ;
    f5 c(250), f6 c(250), f7 c(250), f8 c(250), f9 c(250), f10 c(250), ;
    f11 c(250), f12 c(250), f13 c(250), f14 c(250), ;
    memofld m, output1 m, output2 m)

append from (comma delimited file) type sdf

repl all memofld with ;
    alltrim(f1+f2+f3+f4+f5+f6+f7+f8+f9+f10+f11+f12+f13+f14)

go 1

do while eof() = .f.
    x = 249
    dcnt = 0
    do while dcnt < 250
        dcnt = occurs(',', substr(memofld, 1, x))
        x = x + 1
    enddo
    repl output1 with alltrim(substr(memofld, 1, x - 1))
    repl output2 with alltrim(substr(memofld, x, 3000))
    skip 1
enddo

go 1

copy memo output1 to output1.csv type sdf
copy memo output2 to output2.csv type sdf
 
This is perhaps a job for a small Visual Basic program, which could do the split by counting the delimiters and write the pieces out as textfile1, textfile2, and so on, each one with no more than 254 fields.

If you send me your ASCII file, I can help you with the help of a good Basic programmer (I just asked him), who could create an exe file that splits your raw data before you need it for VFP reading.

Regards from Germany
Klaus

klaus.briesemeister.kb@bayer-ag.de

 
Have you considered FOPEN() low-level file operations?

* open the file, then for each line:
* x = number of %'s to skip
dataiwant = substr(line, at('%', line, x) + 1)

Sorry about the lack of detail; I'm in a hurry. If you need more, let me know.
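The same skip-N-delimiters idea can be sketched in Python (the function name is mine; the '%' delimiter comes from the original question): skip n delimiters, then take everything up to the next one.

```python
def field_after(line, n, delim="%"):
    """Return the field that follows the n-th delimiter in `line`.

    Mirrors the low-level approach above: walk past n delimiters,
    then take everything up to the next delimiter (or end of line).
    """
    pos = -1
    for _ in range(n):
        pos = line.index(delim, pos + 1)  # position of the next delimiter
    end = line.find(delim, pos + 1)
    return line[pos + 1:] if end == -1 else line[pos + 1:end]
```

For example, `field_after("a%b%c%d", 3)` returns the last field without ever building a 257-column table.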
 
Hi Cedric,
To avoid using a BASIC program: here is a VFP program which simulates your problem using a string. Since your text file can be copied as one long string into a memo field, you can perhaps use it.

The field limit can be set by changing the variable "limit" to whatever you want (254, 255, or so).

*longstring.prg

* This program divides raw files into field portions.
* A raw file is simulated here by the string yourstring.
* I hope this is the same format you have there in your file.

* In this example the limit of the table = 3 fields; the first part is
* calculated in yourleftstring (see bottom), and the remaining portion
* is stored in yourrightstring (see bottom).
* If you have more than 254 x 2 fields you will need a second limit,
* which of course can be calculated in advance.

CLOSE DATA
CLEA

yourstring = "12%123%1234%123%12345%"  && I assume your file is built like this

limit = 3  && for FoxPro this is 254 or 255 fields max;
           && in this example the table can only have 3 fields

counter = 0           && to count how many fields your file has
firstcut = 0          && to find out where the first split is
yourleftstring = ""   && the first string part of your text file
yourrightstring = ""  && the 2nd string part of your text file
searchedword = "%"    && your delimiter

FOR i = 1 TO LEN(yourstring)
    IF SUBSTR(yourstring, i, 1) = "%"
        firstcut = RAT(searchedword, yourstring, limit)  && the position of the split
        counter = counter + 1
    ENDIF
ENDFOR

? counter   && just to show you the occurrences of "%"
? firstcut  && just to show you the first position where to split

yourleftstring = LEFT(yourstring, firstcut)  && the left part of your file

yourrightstring = SUBSTR(yourstring, LEN(yourleftstring) + 1, LEN(yourstring))

? yourleftstring   && shows 12%123%1234%
? yourrightstring  && shows 123%12345% = right part of file
? yourstring       && the text file itself

* Regards from Germany
* Klaus

 
It looks like the logic I posted earlier would work much more efficiently using FOPEN() and then FGETS() into a variable within a loop that also takes the appropriate substrings and FWRITE()s them to other files.

I had not explored the low-level functions before... does anyone know if they are affected by the same 2 Gig limit as most other files? E.g. if I have a 5 Gig text file, can I use these low-level functions to break it up into five 1 Gig files?

Thanks for opening my eyes, infinitelo.

Brian
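Whatever VFP's own file-size limits turn out to be, the splitting step itself can be done outside VFP. A minimal Python sketch (the chunk size and output naming are assumptions) that breaks a large text file into pieces of at most a given size, always cutting on line boundaries:

```python
def split_by_size(src_path, max_bytes, prefix="part"):
    """Split src_path into files of at most ~max_bytes each,
    cutting only at line boundaries. Returns the paths written."""
    paths, out, written, n = [], None, 0, 0
    with open(src_path) as src:
        for line in src:
            # start a new chunk when this line would overflow the current one
            if out is None or written + len(line) > max_bytes:
                if out:
                    out.close()
                n += 1
                path = f"{prefix}{n}.txt"
                paths.append(path)
                out = open(path, "w")
                written = 0
            out.write(line)
            written += len(line)
    if out:
        out.close()
    return paths
```

A single line longer than max_bytes gets a file of its own; for the fixed-format files discussed in this thread that case should not arise.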
 
Dear all,

Thanks for all your help.
In the end I used a C++ program that split the raw text file in two. Sadly I no longer have it, as we switched all the data processing from FoxPro to SQL 2000. My client's databases and raw files (~5 GB/month) tended to be too big for FoxPro. For 6 months I had to create 2 main tables and amend my programs accordingly; overall it doubled my processing time.
By contrast, Transact-SQL is more flexible and works faster on our servers. I saved roughly 5 days out of a processing time of 10 days!

Thanks again
Cedric

Cedric
Foxpro Beginner
 
I've recently explored this some more... below is a program that runs through a "^"-delimited text file, sums a balance field, and calculates a balance-weighted average score.

I have found it takes approximately twice as long as doing the same calculations using SQL (22 seconds vs. 11 seconds on a 52MB text/111MB dbf containing 496,000 records), but on a plus-2-gig file it would be an attractive solution.

Hope this helps someone....
Brian

*************

CLOSE ALL
VarSum = 0  && running sum of balance * score
VarWA = 0   && running sum of balance

fieldnum1 = 9   && balance
fieldnum2 = 27  && score

? TIME()
lnHandle = FOPEN("commadelimtext.txt")
DO WHILE NOT FEOF(lnHandle)
    lcString = FGETS(lnHandle, 1000)
    FieldValue1 = VAL(SUBSTR(lcString, AT("^", lcString, fieldnum1) + 1, ;
        (AT("^", lcString, fieldnum1 + 1) - 1) - AT("^", lcString, fieldnum1)))
    FieldValue2 = VAL(SUBSTR(lcString, AT("^", lcString, fieldnum2) + 1, ;
        (AT("^", lcString, fieldnum2 + 1) - 1) - AT("^", lcString, fieldnum2)))
    VarSum = VarSum + (FieldValue1 * FieldValue2)
    VarWA = VarWA + FieldValue1
ENDDO
= FCLOSE(lnHandle)
? TRANSFORM(VarWA, '$$$,$$$,$$$,$$$,$$$.99')  && sum of balance
? VarSum / VarWA                              && balance-weighted average score
? TIME()
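The same single streaming pass can be sketched in Python (the function name and sample field positions are mine; field positions here are 1-based column numbers, whereas the VFP code above indexes by delimiter occurrence):

```python
def weighted_average(path, bal_field, score_field, delim="^"):
    """One streaming pass over a delimited text file.

    Returns (total balance, balance-weighted average score)
    without ever loading the file into a table.
    """
    total_bal = 0.0
    weighted = 0.0
    with open(path) as f:
        for line in f:
            fields = line.rstrip("\n").split(delim)
            bal = float(fields[bal_field - 1])
            score = float(fields[score_field - 1])
            total_bal += bal
            weighted += bal * score
    return total_bal, weighted / total_bal if total_bal else 0.0
```

Because it reads line by line, memory use stays flat no matter how large the input file is.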
 
The other answers are elegant, but miss the question.

How do you get 257 fields?
Option 1: stack fields, a method used by SBT Accounting (it has been a decade since I've worked with them). Example:
Six integer fields could be stacked in a character field 60 bytes wide.
cVal = str(I1) + str(I2) + str(I3) etc.
This takes a lot of code and is clumsy.
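Field stacking as described above can be sketched in Python (the 10-byte width per integer is an assumption, matching the "six integers in 60 bytes" example and VFP's default STR() width):

```python
WIDTH = 10  # bytes per stacked integer; 6 fields fit in a 60-byte field

def stack(values):
    """Pack integers into one fixed-width character field,
    like cVal = str(I1) + str(I2) + ... in the post above."""
    return "".join(str(v).rjust(WIDTH) for v in values)

def unstack(cval):
    """Recover the integers from a stacked character field."""
    return [int(cval[i:i + WIDTH]) for i in range(0, len(cval), WIDTH)]
```

This illustrates the trade-off the post mentions: the storage is compact, but every read and write needs pack/unpack code.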

The option I went for on a hospital form requiring some 800-odd fields was:

Table1 with 200 fields
Table2 with 200 fields
Table3 with 200 fields
Table4 with 200 fields

Table1 held the key fields. This was before database containers, so now I would use Table1 as the main table, with stored procedures so that an append appends a record to all four tables, a delete deletes the corresponding record in all four, etc.

Regarding SET RELATION:
I'm uncomfortable indexing on RECNO() and setting the relation to that, so I would put a key field in each of the four tables and check that they match every step of the way.

I hope that this helps.
 