Hello all
I have a couple of reports which pull data from a catalog designed using Impromptu 7.2. These reports open up too slowly. Are there any optimization techniques I can follow to improve the speed? The reports are complex. Any insight?
neil
Do they always open up too slowly, or just at certain times?
Are you using Sybase, Oracle, or some other database? The new version seems to open up the same as before for us, and we have over 100 reports to open. Did you download 7.1.529 MR2?
CP
The version is 7.1.339. These reports never open up if I try to get all the data, but they do open if I enter the right parameters. Are there any optimization techniques?
neil
Can you tell us where the bottleneck is?
1) Opening of the report
2) Generating the SQL for the report
3) Execution of the SQL
4) Retrieval of the data from the database
5) Display of the data on the screen
Here are some crude ways of checking each of the above:
1) How big is the .imr file?
2) Go to Query Data -> Profile tab. Is the SQL huge?
3) Go to Query Data -> Profile tab. What is the "Query execution time"?
4) How big (how many rows) is the result set?
5) Is there a lot of custom formatting in the report?
Download the latest version and try the reports with it to see if there is a difference.
Did you make any changes to your .ini files prior to this install? If so, can you find out what they were? Any time you reinstall, save the cogdmor.ini, cognos.ini, and impromptu.ini files for reference, because some of them are altered by new installs. Changes that were made before may be needed again with your new install. Need more info.
gasman
About the number of rows and the query execution time, the Profile tab says no statistics are available. Yes, there are a lot of custom calculations in the report. The result set is also very large, running into hundreds of thousands of records.
Thanks
neil
Neil
There are probably a number of options that could improve the performance of the report, but it will take more Q&A to understand your reporting environment.
If you think the performance problem is because of the custom calculations, I can think of a couple of options:
1) You can try to move some of the calculation work into the database by building pre-calculated tables or views, shifting part of the workload onto the database server (there is a rough sketch of this after the list).
2) You can run Impromptu on a bigger machine. (Some of the complex reports I build are run on a server-class machine that I remote into.)
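As a rough sketch of option 1, something like the view below could be built in the database and then added to the catalog. The table and column names (SALES_FACT, CUSTOMER_ID, ORDER_DT, SALE_AMT) are just placeholders for your own schema, and the date formatting is Oracle-style, so adjust it for your database.

-- Hypothetical pre-aggregated view: roll detail rows up to one row per
-- customer per month so the report no longer has to crunch hundreds of
-- thousands of records and custom calcs on the client.
CREATE VIEW MONTHLY_SALES_SUMMARY AS
SELECT CUSTOMER_ID,
       TO_CHAR(ORDER_DT, 'YYYY-MM') AS ORDER_MONTH,
       SUM(SALE_AMT)                AS TOTAL_SALES,
       COUNT(*)                     AS ORDER_COUNT
FROM   SALES_FACT
GROUP  BY CUSTOMER_ID, TO_CHAR(ORDER_DT, 'YYYY-MM');

The report would then select TOTAL_SALES and ORDER_COUNT from the view instead of recomputing them from the detail table on every run.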
If you think it's taking long because of extremely large result sets, you can go with option 2 above. Also make sure "Retrieve only the required rows" is set on the "Access" tab of Query Data.
Finally, if you notice a change in performance from one version of Impromptu to another, I agree with CP: the reason could be that you have optimized .ini files from your old version of Impromptu that need to be used with your new version. Also, check the SQL that was generated when the report was running faster and compare it to the SQL you are getting now with the slower version. My bet is that it will not be the same.
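Purely as a hypothetical illustration of the kind of difference to look for (the table and column names are made up): the faster version may have pushed a filter into the generated SQL, while the slower one fetches everything and filters on the client.

-- SQL generated by the faster setup: the filter is in the WHERE clause,
-- so the database only returns the rows the report needs.
SELECT CUSTOMER_ID, SALE_AMT
FROM   SALES_FACT
WHERE  REGION_CODE = 5;

-- SQL generated by the slower setup: no WHERE clause, so every row comes
-- back over the network and Impromptu filters locally.
SELECT CUSTOMER_ID, SALE_AMT
FROM   SALES_FACT;

If you see that sort of change, the .ini settings and the "Retrieve only the required rows" option mentioned above are the first places to look.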