Gary Sutherland
Programmer
Large CSV files are a topic that has cropped up before, I know, but I can't find a reference to this particular problem.
I'm writing a tool to import address data from the UK PAF.CSV postcodes/addresses file. This is a CSV file containing (at present) 31,827,747 individual UK postal addresses. I'm opening the file with FOPEN and reading each line with FGETS.
To get around the maximum size limit for a DBF I'm writing these to a set of ten identical tables in blocks of 3,500,000 records, which is working well. The problem I'm encountering is that when it reaches line 25,214,532, FGETS generates an error saying the file is too big. At that point it has already written 714,531 records to table 8, which should still have room for another 2,785,468 records, and remember there's still capacity for 7,000,000 more in tables 9 and 10.
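In outline, the import loop is something like this (simplified here, with placeholder table names PAF1..PAF10 and a single raw-line field rather than the full PAF structure):

* Simplified sketch of the read/write loop described above.
* PAF1..PAF10 and the field "rawline" are placeholders for this example.
LOCAL lnHandle, lnCount, lnTable, lcLine
lnHandle = FOPEN("PAF.CSV")
IF lnHandle < 0
   RETURN
ENDIF
lnCount = 0
lnTable = 1
USE ("PAF" + TRANSFORM(lnTable)) IN 0 ALIAS paf
DO WHILE NOT FEOF(lnHandle)
   lcLine = FGETS(lnHandle, 8192)      && this is the call that errors at line 25,214,532
   lnCount = lnCount + 1
   IF lnCount > 3500000                && block of 3,500,000 reached, move to the next table
      lnCount = 1
      lnTable = lnTable + 1
      USE ("PAF" + TRANSFORM(lnTable)) IN SELECT("paf") ALIAS paf
   ENDIF
   INSERT INTO paf (rawline) VALUES (lcLine)
ENDDO
FCLOSE(lnHandle)
USE IN SELECT("paf")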
So, this appears to be a limitation of the FOPEN/FGETS functionality in VFP.
Short of using something to split the PAF.CSV file into two files and importing them sequentially, I'm open to suggestions.
Regards
Gary