
Reading a Huge File


RobHudson (Programmer), Apr 30, 2001
I am trying to read the contents of a huge file [could be as large as 500 MB!] and then load it into MySQL.

My idea was to call a page that reads a section of the data, loads it, and then calls itself again for the next section, and so on until the whole file has been loaded. Does that sound feasible?

I have read on various sites that this sort of thing can be done, but have yet to find an example!
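
Roughly, what I have in mind is something like this (just an untested sketch; the page name, offset parameter, and chunk size are all invented):

// Sketch of the self-calling idea: process one slice per request,
// then redirect to this same page with the next byte offset.
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$chunk_size = 1048576; // 1 MB per pass, an arbitrary choice

$fp = fopen($doc_name, "rb");
fseek($fp, $offset);
$data = fread($fp, $chunk_size);
// ...handle the string and do the MySQL bit...
// (a chunk boundary can split a line, so the remainder would
// need carrying over to the next pass)
$more = !feof($fp);
fclose($fp);

if ($more)
{
    header("Location: load_chunk.php?offset=" . ($offset + $chunk_size));
    exit;
}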

My current code [which does work, but after a while just gives up because it takes too long, which is what I would expect!] is as follows:

set_time_limit(0);
@$fp = fopen($doc_name, "rb");
// ...test file is ok, etc...
while (!feof($fp))
{
    // note: filesize() expects a filename, not a file handle,
    // so fgets() needs an explicit maximum line length (e.g. 4096)
    $line = fgets($fp, 4096);
    // ...handle the string and do the MySQL bit...
}
fclose($fp);

Thanks for any help :)
 
I don't know what format your data is in, but have you thought about doing it in two phases: one to process the data into a specific format, and the other to load it?

If you pre-process the data and then use MySQL's LOAD DATA command, you might have better success.
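
For example, once the data is in a simple delimited file, something along these lines (just a sketch; the file path, table, and column names are made up, and it assumes an open MySQL connection):

// Sketch only: load the whole pre-processed file in one statement.
// '/tmp/records.csv', my_table and the column names are invented here.
$sql = "LOAD DATA INFILE '/tmp/records.csv'
        INTO TABLE my_table
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        (col_a, col_b, col_c)";
mysql_query($sql) or die(mysql_error());

LOAD DATA runs inside the server, so it avoids a round trip per row and is typically much faster than issuing individual INSERTs.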

______________________________________________________________________
TANSTAAFL!
 
Thank you for that. That could turn out to be useful :)

The file is in a fixed-length record format. If I were to pre-process the file, I would still face the script/server timing out, since it takes so long to work through the file...

Is it possible to read the file in chunks?
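
Something like this is what I'm picturing (untested sketch; I've just assumed an 80-byte record for illustration):

$record_len = 80; // made-up record length for this sketch
$fp = fopen($doc_name, "rb");
while (!feof($fp))
{
    $record = fread($fp, $record_len); // one fixed-length record per read
    if (strlen($record) < $record_len) break; // partial read at end of file
    // ...handle the string and do the MySQL bit...
}
fclose($fp);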

Thanks
 