In terms of what? Index size? Relevance of results? The traffic they can deliver? And it's a constantly changing world, so what is correct can change frequently.
The underlying technology is fairly simple: a piece of software (the crawler/spider) visits a webpage, reads (HTTP GET) the HTML stream served up, stores it in a database, then moves on. Some crawlers will be following the existing URLs in the db to check and refresh the index, while others will follow the links found on the page and add those to the db.
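
Just to make that concrete, a very rough sketch of that crawl-and-store loop in Python might look like the following. It's purely illustrative; the sqlite tables and column names are my own invention for the example, and a real crawler also has to deal with robots.txt, politeness delays, redirects, duplicate content and so on.

import sqlite3
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    # Collects the href targets of <a> tags found in the page
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def open_db(path="index.db"):
    # One table for fetched pages, one for URLs still to be visited
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT, status TEXT)")
    db.execute("CREATE TABLE IF NOT EXISTS queue (url TEXT PRIMARY KEY)")
    return db

def crawl(url, db):
    # HTTP GET the page and store the raw HTML, flagged as new for the analyser
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    db.execute("INSERT OR REPLACE INTO pages (url, html, status) VALUES (?, ?, 'new')",
               (url, html))
    # Every link found on the page goes into the queue for a later visit
    parser = LinkParser()
    parser.feed(html)
    for link in parser.links:
        db.execute("INSERT OR IGNORE INTO queue (url) VALUES (?)", (urljoin(url, link),))
    db.commit()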
A separate program (the analyser) then analyses the pages marked as new or changed in the db and assigns weighting scores to certain words according to various factors it finds within the HTML code.
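
Only as a toy illustration of the idea: the factors and weights below are completely made up (the real ones are exactly the secrets mentioned further down), but they show the kind of thing an analyser does, i.e. score a word higher when it appears in the title or a heading than when it sits in ordinary body text.

from collections import defaultdict
from html.parser import HTMLParser

# Invented weighting factors for illustration only, not any engine's real numbers
FACTOR_WEIGHTS = {"title": 5.0, "h1": 3.0, "body": 1.0}

class WeightingParser(HTMLParser):
    # Gives each word a score depending on where it appears in the HTML
    def __init__(self):
        super().__init__()
        self.context = "body"
        self.scores = defaultdict(float)

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self.context = tag

    def handle_endtag(self, tag):
        if tag in ("title", "h1"):
            self.context = "body"

    def handle_data(self, data):
        weight = FACTOR_WEIGHTS[self.context]
        for word in data.lower().split():
            self.scores[word] += weight

def analyse(html):
    # Returns a dict of word -> weighting score for one page
    parser = WeightingParser()
    parser.feed(html)
    return dict(parser.scores)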
The search code will then take the word or phrase from the input, locate the pages that match it, and display them as SERPs, ordered according to the weighting scores and that particular engine's algorithm.
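
In miniature, that lookup-and-ordering step might look something like this, assuming the word-score dictionaries produced by the analyser sketch above; real engines use inverted indexes and a great many more ranking signals, so treat it as a cartoon rather than a blueprint.

def search(phrase, index):
    # index maps url -> {word: score}, as produced by analyse() above
    terms = phrase.lower().split()
    results = []
    for url, scores in index.items():
        # Only pages containing every term in the query qualify
        if all(term in scores for term in terms):
            results.append((sum(scores[term] for term in terms), url))
    # Highest combined weighting score first: the SERP order in miniature
    return [url for total, url in sorted(results, reverse=True)]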
The algos and weighting factors used by the engines are closely guarded secrets and can change quite often (sometimes every month at Google).
Two of the best SEO forums around for doing more research are:
These two quite possibly have the best ethical SEOs around posting theories and ideas.
Chris.
Indifference will be the downfall of mankind, but who cares?