maxhugen . . .
Understand . . . [blue]I'm not trying to dispel anything you've presented here . . .
You're Right![/blue] However, if you see my point (at least for me) . . . whether or not to use aggregates these days truly depends on processor speed. Where my code would've been too slow at 200MHz, it's no problem at 2GHz or higher . . . which I expect is about the average speed today. So I always design code with this in mind.
My problem is getting into super-formulas that can easily overwhelm the query optimizer. I believe it should all be easy . . . so much so that if I spend more than half an hour on a query, I automatically move the logic to a function to make things easier on myself (a sketch of what I mean follows), tagging the item for cleanup and testing with large recordsets . . . which I do for each DB.
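For example, a minimal sketch of what I mean (the table, field, and function names here are just made up for illustration) . . . instead of nesting the lookup expression inside the query, I wrap it:

[code]
' Hypothetical example: wrap the messy expression in a function so the
' query stays readable. Tagged for cleanup/optimization later, as above.
Public Function LatestOrderDate(lngCustID As Long) As Variant
    LatestOrderDate = DMax("OrderDate", "tblOrders", "CustomerID = " & lngCustID)
End Function
[/code]

Then the query simply calls it: SELECT CustomerID, LatestOrderDate(CustomerID) AS LastOrder FROM tblCustomers;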
It may not be readily realized as we're all focused on our tasks, but [blue]processor speed has opened a great many doors that used to be closed![/blue] . . . So why not use this speed to our advantage!
My benchmark is done with an old piece of hardware that's programmed to detect the function and literally single-step the processor, subtracting its own cycle overhead to arrive at the final result. The problem is it's only accurate down to tens of milliseconds, which is why it can't be used with the speed of today's processors. The resolution is simply outside its window.
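These days I'd lean on the high-resolution counter in Windows instead . . . a rough sketch (32-bit Declare syntax; the Currency type is a handy trick for holding the 64-bit counter value):

[code]
' Sketch: software timing via the Windows high-resolution counter.
' Currency is a scaled 64-bit integer; the scale factor cancels in the ratio.
Private Declare Function QueryPerformanceCounter Lib "kernel32" (lpCount As Currency) As Long
Private Declare Function QueryPerformanceFrequency Lib "kernel32" (lpFreq As Currency) As Long

Public Sub TimeIt()
    Dim curStart As Currency, curStop As Currency, curFreq As Currency
    QueryPerformanceFrequency curFreq
    QueryPerformanceCounter curStart
    ' ... code under test goes here ...
    QueryPerformanceCounter curStop
    Debug.Print "Elapsed: " & Format$((curStop - curStart) / curFreq, "0.000000") & " sec"
End Sub
[/code]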
So as we both know, normally (if this were years ago) [blue]DMax is a slow turtle.[/blue] But with today's speed . . . nary a problem. [green]I've simply learned to let processor speed work for me as well.[/green] If the user apprises me they have slow machines, then of course . . . faster code is required!
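For instance, here's the trade-off in a nutshell (names are hypothetical) . . . the DMax one-liner versus a faster DAO equivalent I'd swap in for slow machines:

[code]
' Easy way . . . fine on today's fast machines:
'   varMax = DMax("InvoiceNo", "tblInvoices", "CustID = " & lngID)

' Faster replacement when a user reports slow hardware:
Public Function FastMaxInvoice(lngID As Long) As Variant
    Dim rst As DAO.Recordset
    Set rst = CurrentDb.OpenRecordset( _
        "SELECT Max(InvoiceNo) FROM tblInvoices WHERE CustID = " & lngID, dbOpenSnapshot)
    FastMaxInvoice = rst.Fields(0)   ' Max() always returns exactly one row
    rst.Close
End Function
[/code]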
I've learned a great many ways to make things easier for myself, as above, keeping in mind I may have to optimize to faster code later. As before . . . these items are tagged for cleanup!
[blue]Your Thoughts? . . .[/blue]
See Ya! . . . . . .
Be sure to see FAQ219-2884: