I am trying to read the contents of a huge file (it could be up to 500 MB) and then load it into MySQL.
My idea was to call a page that reads a section of the data, loads it, and then calls itself again to load the next section, and so on until the whole file has been loaded. Does that sound feasible?
I have read on various sites that this sort of thing can be done, but I have yet to find an example.
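Roughly what I have in mind is something like the sketch below. It is a minimal sketch only: the script name chunk_loader.php, the ?offset= parameter, the file name, and the chunk size are all hypothetical, and the MySQL handling is whatever the current script already does.

<?php
// chunk_loader.php -- sketch of a self-calling, chunked import.
$doc_name  = 'bigfile.txt';  // hypothetical file name
$chunkSize = 1000;           // lines to process per request (hypothetical value)
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$fp = fopen($doc_name, 'rb');
if ($fp === false) {
    die('Could not open file');
}
fseek($fp, $offset); // resume where the previous request stopped

for ($i = 0; $i < $chunkSize && !feof($fp); $i++) {
    $line = fgets($fp); // read one line
    // ...handle the string and do the mysql bit...
}

$next = ftell($fp); // byte position reached in this request
$done = feof($fp);
fclose($fp);

if (!$done) {
    // call this page again to load the next section
    header('Location: chunk_loader.php?offset=' . $next);
    exit;
}
echo 'Import complete.';

Each request only does a bounded amount of work, so it should stay within the server's time limit; the offset in the URL is what carries the progress from one request to the next.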
My current code, which does work but eventually gives up because it takes too long (which is what I would expect), is as follows:
set_time_limit(0);
@$fp = fopen($doc_name, "rb");
// ...test file is ok, etc...
while (!feof($fp))
{
    $line = fgets($fp); // no length argument needed: filesize() expects a path, not a file handle
    // ...handle the string and do the mysql bit...
}
fclose($fp);
Thanks for any help