
Highlight record with the highest value

Status
Not open for further replies.

qwerty70

Technical User
Dec 13, 2005
73
GB
Dear All,

My inquiry is relevant to thread702-1381356.

In my report, is there any way to highlight the record that contains the highest revision number? As you can see from my sample data below, the revision number field has two different value formats: alphanumeric and numeric. Therefore, the highest revision should be "3E". Please help.

txtDrwgNum txtRevNum
20-SD-2001 1A
20-SD-2001 2F
20-SD-2001 3
20-SD-2001 3E
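The thread is about an Access report, but the core of the question is how to rank mixed revisions like "3" and "3E". One way, sketched here in Python rather than VBA purely to illustrate the logic, is to split each revision into its numeric part and its (possibly empty) letter suffix and compare on that pair, so "3E" sorts above "3" (the function and field names below are just illustrative):

```python
import re

def rev_key(rev):
    """Split a revision like '3E' into (3, 'E') so revisions sort
    numerically first, then by letter suffix ('' sorts before 'A')."""
    m = re.match(r"(\d+)([A-Za-z]*)$", rev)
    return (int(m.group(1)), m.group(2).upper())

revs = ["1A", "2F", "3", "3E"]
highest = max(revs, key=rev_key)
print(highest)  # → 3E
```

The same split-then-compare idea could be written as a small VBA function and used in the report's conditional formatting or in a query criterion.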

Thanks in advance,

qwerty70
 
Woops . . . hit submit too soon . . .

Bear in mind that Jet optimizes the running of a query; [purple]it can't optimize a function![/purple] That is to say, a function runs its full course (its number of processor cycles) whether that function is in VBA or in the SQL of a query.

You use the [blue]Max[/blue] function, which is a form of aggregate, and an IIf, in which both answers are always evaluated (this is the way IIf performs).
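The point about IIf evaluating both answers can be shown outside VBA too: any ordinary function receives its arguments already evaluated. Here's a small Python sketch (the `iif` helper is hypothetical, mimicking VBA's IIf) that records which branches actually ran:

```python
def iif(condition, true_part, false_part):
    """Mimics VBA's IIf: by the time this function runs, BOTH
    true_part and false_part have already been evaluated."""
    return true_part if condition else false_part

calls = []
def expensive(label):
    calls.append(label)   # record that this branch actually executed
    return label

result = iif(True, expensive("yes"), expensive("no"))
print(result, calls)  # → yes ['yes', 'no'] — the false branch ran anyway
```

So if either branch of an IIf is costly, that cost is paid on every row regardless of the condition.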

Although I use DMax, if you take a look at the criteria you'll see that a very small set of records is involved (specifically, only the records for that drawing #), so the impact of DMax won't be seen.
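To make the point concrete, here's a toy Python stand-in for what DMax does with a criteria string (the function and field names are illustrative, not Access's implementation): it scans only the rows matching the criteria, so the cost depends on the size of that subset, not the whole table:

```python
def dmax(field, records, criteria):
    """Toy stand-in for Access's DMax: scan only the rows matching
    the criteria and return the largest value of `field`."""
    subset = [r[field] for r in records if criteria(r)]
    return max(subset) if subset else None

drawings = [
    {"txtDrwgNum": "20-SD-2001", "txtRevNum": "1A"},
    {"txtDrwgNum": "20-SD-2001", "txtRevNum": "3E"},
    {"txtDrwgNum": "20-SD-2002", "txtRevNum": "7B"},
]
print(dmax("txtRevNum", drawings,
           lambda r: r["txtDrwgNum"] == "20-SD-2001"))
# → 3E  (only the two rows for that drawing # are compared)
```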

See Ya! . . . . . .

Be sure to see FAQ219-2884:
 
The difference, as I understand it, is that JET evaluates built-in functions such as Max() itself. However, it can't do that with custom functions, which is why custom functions are particularly slow. This is exacerbated when using a BE and FE, as (I'm guessing) the data is repeatedly shuffled back and forth from the BE to the FE (for processing) and back to the BE.

As for DMax(), even though it's evaluating only a small subset each time it runs, it nevertheless must run for every record returned by the report, and there could be 1000s of drawings.

The domain functions are well known for performance issues, especially on linked tables.
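The usual remedy for a per-record domain function is to compute the maxima once and look them up, rather than rescanning the table for every report row. This hedged Python sketch (illustrative names, not the poster's code) contrasts the two shapes:

```python
from collections import defaultdict

rows = [
    {"drwg": "20-SD-2001", "rev": "1A"},
    {"drwg": "20-SD-2001", "rev": "3E"},
    {"drwg": "20-SD-2002", "rev": "2C"},
]

# DMax-per-row shape: one scan of the whole table for EVERY report row.
def dmax_per_row(rows):
    return [max(r2["rev"] for r2 in rows if r2["drwg"] == r["drwg"])
            for r in rows]

# Aggregate-then-join shape: one pass builds the maxima; lookups are O(1).
def max_via_groupby(rows):
    maxima = defaultdict(str)
    for r in rows:
        maxima[r["drwg"]] = max(maxima[r["drwg"]], r["rev"])
    return [maxima[r["drwg"]] for r in rows]

assert dmax_per_row(rows) == max_via_groupby(rows)  # same answer, very different cost
```

In Access terms, the second shape corresponds to a GROUP BY totals query joined back to the report's record source, which is why removing domain functions from report queries tends to pay off so handsomely.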

Max Hugen
Australia
 
maxhugen . . .

I've long known about the performance issues you speak of, and I'm a reluctant user of DMax. However, I did test the code BE/FE and the results were respectable. Then I tested on an old machine of mine that I keep around for benchmarking (200 MHz processor), and yes, your code won the race. On a 2 GHz machine you can't tell the difference . . . Bear in mind this is with a test table of 100K records.

BTW: I can't see an aggregate function like Max being pre-evaluated that way, since its argument is open to any expression that will return a value.

In any case . . . why not run your own benchmark and let me know . . .

See Ya! . . . . . .

Be sure to see FAQ219-2884:
 
I haven't found any really satisfactory method of benchmarking. I use timing functions which, at the end of the day, are dependent on many variables: network speed/congestion, users' workstation specs, what they're running and so on. Even in my own dev environment it can vary sometimes by quite a lot.
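A common way to tame at least some of that variability, sketched here in Python with the standard `timeit` module (the `candidate` function is just a placeholder for whatever code is being measured), is to repeat the measurement several times and keep the best run, since the minimum is least affected by one-off interference:

```python
import timeit

def candidate():
    # Stand-in for the code under test.
    return sum(i * i for i in range(1000))

# Best-of-several repeats damps out one-off interference (network hiccups,
# other processes); the minimum is usually the most stable number.
timings = timeit.repeat(candidate, number=100, repeat=5)
print(f"best of 5 runs: {min(timings):.4f}s for 100 calls")
```

It doesn't remove the dependence on workstation specs or network load, but it makes before/after comparisons on the same machine much more trustworthy.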

What I can say is that for two clients, each of whom has around 20 concurrent users (!) in a networked environment, I spent a lot of time looking for application bottlenecks, and both apps ran quite well considering the number of users and, in their case, rather overloaded networks. These were both apps that I inherited.

I have since managed to port one of the apps to the Internet, but the other client suffers from 'large corporation IT fiefdom syndrome' - they get little response or real help from IT. So for now at least they're stuck with a legacy A97 app. Still, after extensive reworking of the app - almost a complete rewrite - the users are tickled pink by the improved interface and performance.

I too have tried custom functions in queries, but perhaps because of the query complexity, I found that doing this was a major performance blow.

Again, in the same apps, I got rid of numerous domain functions from some reports, also with somewhat involved queries, and the speed improvements were substantial... eg one report that took several minutes now took something like 20s.

I don't have the facilities to run any decent benchmarking tests as I guess you do, so I can only go by what I've achieved in real world apps.

Max Hugen
Australia
 
maxhugen . . .

Understood . . . [blue]I'm not trying to dispel anything you've presented here . . . You're Right![/blue] However, if you understand my point (at least for me) . . . whether or not to use aggregates these days truly depends on processor speed. Where my code would be too slow at 200 MHz, there's no problem at 2 GHz or higher . . . which I expect is the average speed today. So I always design code with this in mind.

My problem is getting into super formulae that can easily exhaust query optimization. I believe it should all be easy . . . so much so that if I spend more than a half hour on a query, I automatically move to a function to make things easier on myself, tagging the item for cleanup and testing with large recordsets . . . which I do for each DB.

It may not be readily realized, as we're all focused on our tasks, but [blue]processor speed has opened a great many doors that used to be closed![/blue] . . . So why not use this speed to our advantage!

My benchmark is done with an old piece of hardware that's programmed to detect the function and literally single-step the processor, counting its processor-cycle calls, hence the final result. The problem is it's only accurate down to tens of milliseconds, which is why it can't be used with the speed of today's processors. The resolution of speed is simply outside its window.

So as we both know, normally (if this were years ago) [blue]DMax is a slow-turtle item.[/blue] But with today's speed . . . it's nearly no problem at all. [green]I've simply learned to let processor speed work for me as well.[/green] If the user apprises me that they have slow machines, then of course . . . faster code is required!

I've learned a great many ways to make things easier for myself as above, keeping in mind I may have to optimize to faster code. As before . . . these items are tagged for cleanup!

[blue]Your Thoughts? . . .[/blue]

See Ya! . . . . . .

Be sure to see FAQ219-2884:
 
Agreed, the new faster processors, and especially the increased RAM, give us much greater leeway.

I write apps for some large US corporations, and it surprises me how 'ancient' lots of their PCs are. Pentium I, minimal RAM, screen resolutions of 800x600 . . . kinda like telling workers to build a house and supplying them with hand saws, hand drills, etc.! Do-able, but oh boy!

And to top it off, the users struggle to get IT to check network loads, etc. . . . e.g., a user opened a 10 MB image file stored on a server, and he had ample time to go get coffee before the file would open.

So I've had to break my balls trying everything I could think of, to get apps in such environments to run at a satisfactory speed. Oh, and remember those 20 odd concurrent users? Crazy.

Never let it be said that Aussies aren't up to a challenge :)

Max Hugen
Australia
 