
Fastest way to display a report with 1200 products

Status
Not open for further replies.

SatishPutcha

Programmer
Mar 12, 2002
Hello Everybody

I am creating a report (using ASP only) of products and their quantities. There are 1,200-odd products to be displayed.

As of now the report takes around 15-20 seconds to load.

I also need to calculate the quantities on Customer Orders(ordered but not yet delivered).

The SQL tables are:
tblProducts->ProductID, ProductName, store-wise quantities
tblOrderDetails->OrderID, ProductID, QuantityOrdered
tblItemsDelivered->OrderID, ProductID, QuantityDelivered

So, besides the product name and the store-wise quantities, I should show SUM(QuantityOrdered) - SUM(QuantityDelivered) for each product.
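In SQL terms, what I am after is roughly this (just a sketch; the ISNULLs and derived tables are my guesses at how the three tables would combine, and I have left the store-wise quantity columns out):

```sql
SELECT p.ProductID,
       p.ProductName,
       ISNULL(o.TotalOrdered, 0) - ISNULL(d.TotalDelivered, 0) AS QtyOutstanding
FROM tblProducts p
LEFT JOIN (SELECT ProductID, SUM(QuantityOrdered) AS TotalOrdered
           FROM tblOrderDetails
           GROUP BY ProductID) o ON o.ProductID = p.ProductID
LEFT JOIN (SELECT ProductID, SUM(QuantityDelivered) AS TotalDelivered
           FROM tblItemsDelivered
           GROUP BY ProductID) d ON d.ProductID = p.ProductID
```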

To allow Response.Flush I am creating a <table> for every 50 products. I don't think I can use paging, since the users would like to print it all out.

So, my question is what would be the fastest way to display these 1200-odd products and their quantities all on one page? Can it be done in say 5 secs or less?

Any help will be greatly appreciated. :)

~Satish Kumar

 
With just ASP? Not going to be easy.

If you create a SQL View that contains only the data you need from each table, then it's easy as pie, assuming you can write the SQL script. This might be a pain if you are doing all the calculations within the current ASP code and are not proficient with SQL Server.
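Something along these lines, say; this is only a sketch with made-up names (vwProductQuantities etc.), and the ISNULLs assume SQL Server:

```sql
CREATE VIEW vwProductQuantities AS
SELECT p.ProductID,
       p.ProductName,
       (SELECT ISNULL(SUM(od.QuantityOrdered), 0)
        FROM tblOrderDetails od WHERE od.ProductID = p.ProductID)
     - (SELECT ISNULL(SUM(dd.QuantityDelivered), 0)
        FROM tblItemsDelivered dd WHERE dd.ProductID = p.ProductID)
       AS QtyOutstanding
FROM tblProducts p
```

Then the ASP page only has to SELECT from the view and write rows out.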

I know that might not have helped, but with ASP it's going to take time to process that much data no matter what.

I could, however, be wrong in thinking that.
 
Thanks for your reply.

Yes I figured out too that ASP might not be the best way to do it and that could in fact be the problem with the speed.

I will look into the SQL script and see if I can do something from it.

~Satish

 
I could, however, be wrong in thinking that.
Yep.

Here is a test with about 4,000 records that are a subset of 23,000 records from an MS Access database. The server processed it in about half a second; unfortunately, because I was flushing way too often, my system was low on RAM, etc., it took my browser 13 seconds to render the insanely huge table.


1) Don't Recordset.Open; use your connection object to Connection.Execute your SQL statement into a new recordset
Code:
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "connection string"
Set rs = conn.Execute("SELECT field1, field2 FROM MyTable")

2) Name your fields in your SQL statement. It's not worth much performance-wise, but it helps. When you do a SELECT * the database has to do a pre-query on everything in your FROM to determine the field names.

3) Use .GetRows() to convert your recordset into an array. Operating on the array instead of the recordset has reduced overhead by 7-11 times in past benchmarks we have done here in the forums. GetString() is supposedly even better, but I haven't done any tests on it and prefer to get rid of recordsets as fast as I can.
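To illustrate (a sketch only; the two-column layout and field order are assumptions):

```vbscript
Dim arrData, iRow
If Not rs.EOF Then
    ' GetRows returns a 2-D array indexed (field, row) - note it is column-major
    arrData = rs.GetRows()
End If
rs.Close
Set rs = Nothing

If IsArray(arrData) Then
    For iRow = 0 To UBound(arrData, 2)
        Response.Write "<tr><td>" & arrData(0, iRow) & "</td><td>" & arrData(1, iRow) & "</td></tr>"
    Next
End If
```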

3.5) Get rid of your recordset as soon as you can; it's just wasting memory:
Code:
rs.Close
Set rs = Nothing

4) Reduce operations in your loop. Try to limit the number of concatenations, function calls, etc. that you do. Concatenations are expensive due to VBScript's horrible memory management.
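One way to cut the concatenation cost is to build each row once into a pre-sized array and Join it at the end, instead of repeatedly appending to one ever-growing string (the names and row count here are made up):

```vbscript
Dim arrRows(), i
ReDim arrRows(1199)                  ' one slot per product row
For i = 0 To 1199
    ' each element is built once, so no giant string gets re-copied every pass
    arrRows(i) = "<tr><td>" & i & "</td></tr>"
Next
Response.Write Join(arrRows, vbCrLf)
```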

5) Reduce the number of flushes you do. When you flush prematurely, the server ends up having to send more partial packets to the end user, instead of only the one final send when it writes the whole buffer at once. Again, a small saving, but a saving nonetheless.
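For example, rather than flushing after every 50-row table, you could buffer everything and flush only every few hundred rows (the numbers here are arbitrary):

```vbscript
Response.Buffer = True
Dim i
For i = 1 To 1200
    Response.Write "<tr><td>Product " & i & "</td></tr>"
    ' a handful of large flushes beats dozens of tiny ones
    If i Mod 300 = 0 Then Response.Flush
Next
```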

6) Multiple tables may help quite a bit. One major cost is browser rendering time. As I showed with my link above, even if the server spits the data out fast, sometimes it takes a while longer for the browser to manage to render it.

7) There is a thread somewhere in the forums that talks about more optimization techniques, find it :)

That's all I have time for at the moment; I have to go shower so I can get to work. Hopefully this will be enough leads for now. If you follow only one of these, do #3; it will give you the biggest savings.

-T

 
1) Do all the data manipulation using stored procedures, or stored queries in the case of Access.
2) Paginate your recordset, and then create a link to a printing page that shows all the records so users can print...

I mean, create two pages: one with the pagination function and the other, without it, for printing purposes...

-DNG
 
Tarwn,

thanks for those great suggestions. I am using disconnected recordsets, and that brought the processing time down a hell of a lot (imagine, I had screwed up the code so much it was taking more than 3 minutes to load!). I will look at GetRows().

DNG,
thanks for your suggestions. I have not yet used stored procedures, since I felt my delay was in the ASP processing time and an SP might not help much. Of course, as you said, if nothing else works I will create pagination, but print the recordset out on one page when they click a "print" button on the screen.

~Satish

 
Yeah Tarwn's straight on the money (as usual)

GetString would be faster, except you have little control over how to manipulate the result.

GetRows is a better choice IMHO. Using GetRows closes the recordset pretty fast: basically you go in, snag the data into an array, close the recordset, then display it as you want.

Here is a link as to why to use GetRows,

and a non-numbered GetRows example (easier to use in some cases)





"Ever stop to think, and forget to start again?"

Stuart
A+, Net+, Security+, MCP
 
I didn't see it mentioned, but you can also put your code in one 100% VBScript block, so the engine doesn't have to keep switching on and off between HTML output and ASP code.
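In other words, instead of dropping in and out of <% %> around every row, write the HTML from a single server-side block (a contrived illustration):

```vbscript
<%
' One block, one context switch - the loop writes its own HTML
Dim i
For i = 1 To 50
    Response.Write "<tr><td>Product " & i & "</td></tr>" & vbCrLf
Next
%>
```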

 