Searching large sites


lesm

Technical User
Jun 11, 2002
I'm not an expert PHP programmer.
I need to search all the files on a large site: 10,000+ files spread across some 300 directories. The problem, of course, is the 30-second timeout. I can use the sleep() function, but I'd prefer a better solution. Since I'm not an expert, I need some ideas in order to find the best approach.

Any help? Thank you.
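
If the server allows it, one direct workaround is simply lifting the time limit for this one script. A minimal sketch, assuming safe mode is off (set_time_limit() is ignored in safe mode); the function name search_dir(), the start path, and the needle are placeholders, not anything from this thread:

<?php
// Lift the 30-second cap for this one script
// (ignored when PHP runs in safe mode).
set_time_limit(0);

// Illustrative recursive search: walk a directory tree and
// print the path of every file whose contents contain $needle.
function search_dir($dir, $needle)
{
    $dh = opendir($dir);
    while (($entry = readdir($dh)) !== false) {
        if ($entry == '.' || $entry == '..') {
            continue;
        }
        $path = $dir . '/' . $entry;
        if (is_dir($path)) {
            search_dir($path, $needle);
        } elseif (strpos(file_get_contents($path), $needle) !== false) {
            echo $path . "\n";
        }
    }
    closedir($dh);
}

search_dir('/var/www/intranet', 'cccc.nnnnnnnnnn.ccccc');
?>

Even with the limit lifted, one request reading 10,000 files will tie up the server for a long time, which is where the batching ideas below come in.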
 
Did you try Google?
lol.. nah, sorry. I don't know how this would be best solved; I think it will put a great deal of stress on the system!

I think you may have to make a script that redirects to itself and runs the search functions in stages, based on an array or something like that, as sketched below.

Olav Alexander Mjelde
Admin & Webmaster
 
Why not have some kind of indexing system that indexes only 100 files at a time? Building the index will take a long time, but once that is done your searches can be handled by a database (see the sketch below)...

Known is handfull, Unknown is worldfull
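
For what it's worth, that index could be as simple as one table holding each file's path and text, filled a batch at a time, after which a search is a single SQL query. A sketch using PDO; the DSN, credentials, table name, and next_batch_of_paths() helper are all assumptions:

<?php
// Hypothetical indexer: push the next 100 un-indexed files into MySQL.
// Run it repeatedly (one batch per request) until the site is indexed.
$db = new PDO('mysql:host=localhost;dbname=intranet', 'user', 'pass');

$ins = $db->prepare('INSERT INTO file_index (path, body) VALUES (?, ?)');
foreach (next_batch_of_paths(100) as $path) {   // next_batch_of_paths() is assumed
    $ins->execute(array($path, file_get_contents($path)));
}

// Searching then becomes one query instead of 10,000 file reads:
$needle = isset($_GET['needle']) ? $_GET['needle'] : '';
$q = $db->prepare('SELECT path FROM file_index WHERE body LIKE ?');
$q->execute(array('%' . $needle . '%'));
foreach ($q->fetchAll(PDO::FETCH_COLUMN) as $path) {
    echo $path . "\n";
}
?>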
 
I can't figure out how an array might help.
Indexing may not help much, since the strings to search for (e.g. "cccc.nnnnnnnnnn.ccccc" or "cc.nnnn.nnnnnn.ccc") change very often. And since it's an intranet system, I can't post a URL.

What I'm trying to figure out is how to pause script execution to avoid the timeout without losing the results gathered so far. Is that possible? Storing them in a file, reloading, and continuing seems very cumbersome.
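It is possible, though it does come down to some variant of the save-and-resume pattern called cumbersome above: serialize the progress (a cursor plus the hits so far) to a state file, redirect, and pick up where the last request stopped. A compact sketch; the state file name and work_on_batch() are illustrative:

<?php
// Hypothetical checkpointing: keep the cursor and the accumulated
// matches together in one state file between requests.
$state_file = 'search_state.dat';

$state = file_exists($state_file)
    ? unserialize(file_get_contents($state_file))
    : array('offset' => 0, 'hits' => array());

// work_on_batch() is assumed: it scans one slice of the file list,
// appends matches to $state['hits'], advances $state['offset'],
// and returns false once everything has been scanned.
$more = work_on_batch($state);

file_put_contents($state_file, serialize($state));

if ($more) {
    header('Location: ' . $_SERVER['PHP_SELF']);   // fresh request, fresh timeout
    exit;
}

// Finished: the results were carried along the whole way, never flushed.
foreach ($state['hits'] as $hit) {
    echo $hit . "\n";
}
?>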

