
Nested loops, efficiency


snowboardr
Programmer
Okay, I read that it's a very bad idea to do nested loops in ASP; I am wondering if there is a better solution.

What I am doing is basically this: I have 20 main categories and approx 5-30 sub categories underneath each main category, and then underneath those sub categories are some random records... which will only be up to 10 for each sub category. So basically it's:

Main Category
Sub Category
Records


The thing of it is, I am actually using ajax to open up all the main categories, sub categories and records; however, I save whether each one is opened or not, so I have to output the page the way it was saved if they refresh etc... So what do you think the best method would be to do this?

I am using MySQL database.

Jason

Army : Combat Engineer : 21B

 
I personally use nested loops all the time.

You can also use getrows to speed things up quite a bit.

Maybe XML to compile all of the records in? Just a thought, as I know very little of XML, but from everything I am reading on AJAX the two seem like they would play together nicely.

Stuart
 
snowboardr,

First, nested loops aren't necessarily bad.
Second, nested loops and/or recursive calls to a database usually ARE.

Personally I would cache the database records inside an XML document within the Application object.

Then write a simple XSLT doc to transform the XML into XHTML within your page (you can do it in straight ASP if you're not familiar with XSLT). With XSLT, XML and ASP kept separate, you separate presentation from data from business logic.. which is almost always a good thing.
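
For illustration, a rough sketch of the caching + transform idea (the file names and Application key are made up; the XSLT itself is left out):

[tt]
' cache the category XML once per application - note a
' free-threaded DOM is required to store it in Application
If IsEmpty(Application("catXML")) Then
    Dim oXml
    Set oXml = Server.CreateObject("MSXML2.FreeThreadedDOMDocument")
    oXml.async = False
    oXml.load Server.MapPath("categories.xml")
    Application.Lock
    Set Application("catXML") = oXml
    Application.Unlock
End If

' transform to XHTML on each request
Dim oXsl
Set oXsl = Server.CreateObject("MSXML2.DOMDocument")
oXsl.async = False
oXsl.load Server.MapPath("categories.xsl")
Response.Write Application("catXML").transformNode(oXsl)
[/tt]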

Using CSS and Javascript you can collapse / expand the nodes of the tree (categories, subcategories etc) as you see fit. If you use Javascript to capture the node path id (e.g. //Cat1/SubCat2) and its state inside a cookie (or send to the server if you want full persistence) then you can use that to set the state of the tree upon each load.

You can get fancier than that and load on demand via AJAX - where your server side component provides sub content (1 level) for each call for a node; the node can be passed as an XPath string, which will make the server side filtering a doddle. Again, the state can be stored on the server, so the first call always returns the last known state (e.g. level of information).
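
A minimal sketch of such an endpoint, reusing the cached DOM from above (the "path" parameter name is made up, and real code should validate it before use):

[tt]
' ajax endpoint: return one level of children for a node
' "path" is an XPath string sent by the client,
' e.g. /cats/cat[@id='1']
Dim sPath, oNode, oChild
sPath = Request.QueryString("path")
Set oNode = Application("catXML").selectSingleNode(sPath)
If Not oNode Is Nothing Then
    For Each oChild In oNode.childNodes
        Response.Write oChild.xml   ' 1 level of sub content
    Next
End If
[/tt]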

The advantage of doing it clientside is the reduction in server hits to answer trivial questions about data that is likely to be relatively static and already sent to the browser at some point.



A smile is worth a thousand kind words. So smile, it's easy! :)
 
20 x 30 x 10 = 6000

Nested loops are neither good nor bad in any language. The problem is that an operation in the innermost loop can be executed an awful number of times, as my opening computation shows.

Typically in ASP on web pages with recordsets we are displaying lines of data. Ya gotta ask yourself whether ya really wish to display 6000 lines of data. I guarantee no one is going to look beyond the twentieth line.
 
schase
I have always had problems with getrows(); I never could get it to work... and never bothered to figure out why...


damber
I was thinking about using XML, but honestly I have approx 8 days until I am shipped out to the military to get this finished and online for while I am gone, and I know how I am if I don't finish a project and put it off for a while... especially programming... so I need to get it taken care of soon. I don't have time to figure out working with XML nodes etc...


rac2
So you are telling me that you do not think people are going to come to the front page, log in, open up every category, and then continually come back and hit the server with all the categories open? Because there are only 20 main categories... but I have noticed a lag of approx .300 to .500 seconds to load the page with them all open... and that is without the third loop, which could be up to 10 records per sub category. I am really worried about efficiency on this one... and I know I didn't pick the best language for that, but like I said I'm on a major time crunch.

Army : Combat Engineer : 21B

 
Getrows will significantly speed up that .3 to .5 second lag time. To quote learnasp.com:

If there are 700 records and 3 columns for example, we have 3,500 database read requests over the wire.

+2,100 ... Each field read is a request
+700 ..... Each .movenext is +1 request
+700 ..... Each .eof test is +1 request
====
3,500 requests

Vs Getrows - 1 transfer into a 2100 element array - manipulate the data later.

Easiest method is to use getrows not by the numbers - as shown here
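
For illustration, a minimal GetRows sketch (the connection object and column names here are made up):

[tt]
Dim rs, aData, iRow
Set rs = oConn.Execute("SELECT id, name FROM categories")
If Not rs.EOF Then aData = rs.GetRows()  ' one transfer, 2-D array
rs.Close
Set rs = Nothing            ' release the recordset right away
If IsArray(aData) Then
    ' GetRows arrays are (column, row)
    For iRow = 0 To UBound(aData, 2)
        Response.Write aData(1, iRow) & "<br>"  ' column 1 = name
    Next
End If
[/tt]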


Stuart
 
rac2,

the OP's statement was that nested loops are bad... they're not - but, as with any feature in programming, they can be abused...
[tt]for i = 0 to 99999999999999[/tt]
or
[tt]while true[/tt]
or
[tt]for i = 0 to 10
i = i -1
next[/tt]
and so on... brandishing these things as bad in themselves is spreading FUD, which is why I corrected the OP.

There are circumstances where evaluating 6000 iterations is 'necessary'.. though usually this can be alleviated by better design of other components of the solution.

But the issue at hand is a poor use of loops..

snowboardr,
I don't have time to figure out working with xml

Then output the html code (e.g. an unordered list) inside a string and cache that - when you load the page, just write it out and deal with the state client side.. it will be a LOT faster. BUT it will also be a little 'jerky' (the js will act after the page has loaded, so showing the items will happen after load, not before) - it's not very elegant, but quick and dirty is sometimes a necessity, and it is what you are asking for.

When you update the database with new categories etc, output the html code to memory/cache again. Simple.

Doing it the right way may seem like a chore, but it will give you lots of benefits.. up to you. You can do the state server side, but you get into the same performance issues you will always get when using nested loops against a database.. trust me.. memory is far faster than any file-system based database..
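
Something like this is all it takes (a rough sketch - the Application key and markup are placeholders):

[tt]
' rebuild the cached list only when it's missing
' (clear Application("catHTML") whenever the categories change)
If IsEmpty(Application("catHTML")) Then
    Dim sHtml
    sHtml = "<ul>"
    ' ... loop over the recordset ONCE here to build the
    ' nested <ul>/<li> markup ...
    sHtml = sHtml & "</ul>"
    Application.Lock
    Application("catHTML") = sHtml
    Application.Unlock
End If
Response.Write Application("catHTML")
[/tt]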

schase,

I appreciate the sentiment, and you're right that Getrows is definitely faster than normal recordsets. However.. a couple of things:

1. There are not 3,500 separate read requests of the database 'over the wire' in your example; there are at most 1,400 - the default cache size for ADO in classic ASP is 1 record (and it can be set higher to speed this latency-bound part of the process up - see the one-line sketch after this list) - it will retrieve the entire record in that request, and you deal with that cached version (of however many records fit in the cache size) until you request .Resync. (The link you provide does say that he fudged the numbers to make a point, and that changing the CacheSize to 50 would reduce the reads to just 14.) The other (bigger) problem with a recordset is the overhead of creating the recordset object and all its sub-collections/objects, such as Fields etc.. it's a reasonably large collection full of metadata to provide functionality such as transactions... which all takes time to build.

2. This example is not using large data sets.. (20-30 is tiny) so the real gains with getrows will be minimal (but still measurable by a computer)
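
Re point 1, the CacheSize tweak is literally one line (the connection and query here are made up):

[tt]
Dim rs
Set rs = Server.CreateObject("ADODB.Recordset")
rs.CacheSize = 50   ' fetch 50 records per round trip, not 1
rs.Open "SELECT id, name FROM categories", oConn
[/tt]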

You can speed up beyond GetRows, if you have a defined string pattern, by using GetString - it is much faster than iterating over an array in vbscript.
e.g.:
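A rough sketch, building table rows straight from the recordset (assumes an open recordset rs with records; 2 = adClipString):

[tt]
Dim sRows
' one call, no vbscript loop at all
sRows = "<tr><td>" & rs.GetString(2, , "</td><td>", "</td></tr><tr><td>")
' GetString appends the row delimiter after the last record,
' so trim the dangling "<tr><td>" it leaves behind
sRows = Left(sRows, Len(sRows) - Len("<tr><td>"))
Response.Write "<table>" & sRows & "</table>"
[/tt]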
In fact, even faster (especially for large datasets) is creating XML from a recordset (GetString can build it) and using XSLT to transform the set of data.. XSLT is very powerful at looping - far more so than vbscript - which you have to do at some point if you want to output the set of data in the array or recordset..

And to speed things up even more, you should use a fast string concatenation class.. string concats in vbscript get progressively slower the more you append (each & copies the entire string, so the total cost grows roughly with the square of the final length).. so iterating in a large loop and concatenating to a single string variable is not good...
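
A bare-bones version of such a class (just a sketch): append the pieces to an array and Join() once at the end, instead of repeated & on one big string.

[tt]
Class FastString
    Private m_parts, m_count
    Private Sub Class_Initialize()
        ReDim m_parts(63)
        m_count = 0
    End Sub
    Public Sub Append(s)
        If m_count > UBound(m_parts) Then
            ' grow geometrically so Append stays cheap
            ReDim Preserve m_parts(UBound(m_parts) * 2 + 1)
        End If
        m_parts(m_count) = s
        m_count = m_count + 1
    End Sub
    Public Function Value()
        If m_count = 0 Then
            Value = ""
        Else
            ReDim Preserve m_parts(m_count - 1)
            Value = Join(m_parts, "")
        End If
    End Function
End Class

' usage:
' Dim s : Set s = New FastString
' s.Append "<li>" : s.Append sName : s.Append "</li>"
' Response.Write s.Value
[/tt]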

A good list to help with performance issues:

and Microsoft's:


A smile is worth a thousand kind words. So smile, it's easy! :)
 

You're welcome snowboardr,...
Glad I spent the time to help you..


A smile is worth a thousand kind words. So smile, it's easy! :)
 
Hey thanks Damber... I don't know what I'd do without the people at tek-tips... :D

[noevil]

Army : Combat Engineer : 21B

 
Good info - I hadn't been to aspfaq in quite a while. GetString is something I'd never quite been able to spend enough time on to fully grasp - I know it is supposed to be even faster than getrows.

And I tend to recommend getrows even for small recordsets - if it's anything like the stuff I've done, 20 loops consisting of 10 loops each of 10-30 items will grow over time. I hate to recode a year or so down the road to try and speed things up to accommodate the growth.

That speed info is a heckuva good explanation - one of the better ones I've seen - you oughta FAQ it here.

Stuart
 
Hey, one more question regarding getrows: do you think it's okay to use that for posts as well? The content of a post could be 20-30 records per page... I am just thinking they will be pretty big strings in memory... But I do have a server basically to myself...

AMD Athlon XP 2200, 1GB RAM

Jason

[red]Army[/red] : [white]Combat Engineer[/white] : [blue]21B[/blue]

 
