
Unnecessary Calculations? Extensive Recalc Time?

Status
Not open for further replies.
Mar 16, 2001
I am interested to learn if anybody else has suffered the following problem...

BOb seems to recalculate my reports when a recalculation is not apparently necessary. For example, BOb will recalc each time I move between the 'pages' in my reports. Why? No data has changed, and surely BOb can remember what the page looked like when I clicked off it?

BOb also recalcs when I copy, paste or move embedded objects such as report titles or graphics. Why?!
I apply a format to an array of cells and off it goes again on another recalc!

This problem is amplified by the amount of time BOb takes to perform the recalc. For some of my larger reports (250,000+ records), the recalc time can be upwards of 30 minutes! Is this normal?
I have been successful in reducing this time by indexing the data within the universe and amending some of the table links, but the time still seems excessive. Certainly more than Discoverer takes to perform a similar query.
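As a rough sketch, the database-side indexing described above might look like this (the table and column names are hypothetical, not taken from the actual universe):

```sql
-- Hypothetical composite index covering the join and filter
-- columns that the report query hits most often.
CREATE INDEX idx_bookings_route_date
    ON bookings (route_id, booking_date);
```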

I am running BOb version 5.0.2 on a machine with 128 MB of RAM and a 750 MHz CPU.

Any help would be greatly appreciated!
 
Well Phil, I can certainly sympathise with your problems; I have had serious problems when building reports with lots of rows. The only explanation I have for this is that BOb does not store the data you see, it stores formulae, so when you change tab, for example, it needs to recalculate all the aggregation and variables. I can, however, suggest the following to minimise the pain:
1. Avoid linking data providers where possible; use multi-pass SQL instead.
2. Minimise the number of rows in the result set by using aggregate functions in the universe, e.g. Sum(Qty) rather than Qty as an object definition.
3. Remove unused variables.
4. Use the report structure view when formatting the report or adding new variables; it doesn't recalc in this view.
5. Return a subset of data when developing.
6. 250,000 rows is a hell of a lot of rows. Is this really a useful report?
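Point 2 can be sketched in SQL terms: if the universe object is defined with an aggregate, the generated query groups in the database, so BOb receives far fewer rows. The table and column names below are hypothetical:

```sql
-- Object defined as the raw column: one row per booking reaches BOb.
SELECT route, booking_qty
FROM   bookings;

-- Object defined as Sum(booking_qty): the database groups first,
-- so BOb receives only one row per route instead.
SELECT route, SUM(booking_qty) AS total_qty
FROM   bookings
GROUP  BY route;
```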
 
Many thanks for your swift response Bob. I'll certainly be trying out your suggestions! I think number 4 may be particularly helpful...

Re point 6: awkwardly, many of the reports I produce compare things like sales activity over a number of years. So although my report will only actually display monthly and annual totals with comparisons, I find that the data cube must contain between 2 and 4 years' worth of data. Doh!

That is unless anyone knows of a trick to get around this of course?!

Many thanks again Bob.
 
Would it not be worth creating a view or snapshot on the database that summarises the previous 2 to 4 years of data? That gets the DB to do the hard work and lets BO work with a smaller result set.
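As a sketch of that idea (assuming an Oracle-style database; all names are hypothetical), a snapshot/materialized view could pre-aggregate the booking detail down to monthly totals, and the universe would then point at the view rather than the detail table:

```sql
-- Hypothetical snapshot: collapses 2-4 years of booking detail
-- into one row per route per month.
CREATE MATERIALIZED VIEW sales_monthly_summary AS
SELECT route,
       TRUNC(booking_date, 'MM') AS booking_month,
       SUM(booking_qty)          AS total_qty
FROM   bookings
GROUP  BY route, TRUNC(booking_date, 'MM');
```

The summary refresh can then be scheduled on the database side, rather than recalculated by BOb on every report action.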

JD
 
That is a good suggestion John, but would I not lose any drill hierarchy on the report if it was generated on a snapshot of the database?

My reports are based on airline data, and allow the user to view the report initially at a route overview level, and then drill down through various levels, for instance to view the travel agencies that booked the routes or the passengers that travelled on them.

Perhaps I'm 'pushing the BOb envelope' here?!
 
How many users require drill down on information? I have a possible solution in mind for you. Is this kind of analysis required on other reports? How many BO users do you have? Do you want to give the information to more people? Where are you based?
 
Hi Paul,

Thanks for your interest, let me fill you in on my situation.

I produce various MI data and reports for the UK-based regional airline that I work for. Currently, our EIS is in the process of being updated, and BOb is the reporting tool we have chosen as part of this.

Therefore, the number of BOb users within our organisation is low at present, as Discoverer 2000 was our previously preferred query tool.
Because of this, none of the reports I am currently producing make use of drill analysis, but I have tried to build them in such a way that, as our number of BOb users grows, I can develop them to make effective use of it. Around 50 of our staff are expected to use BOb eventually.

Potentially, most of the reports I produce would benefit greatly from drill capability, but I am deeply concerned about the amount of time my reports are taking to run, refresh and compute. It's an issue that I am trying to address and gain some clarity on before I move forward onto further analysis techniques such as drill.

At this early stage of my organisation's use of BOb, any information or advice on how I might reduce the amount of time BOb takes to churn things out would at the very least save me hours each day, which are currently spent waiting for results!

If you would like to contact me directly you can reach me on phil.husbands@british-european.com
Meanwhile, many thanks again for your response, I'm curious to know what kind of solution you have in mind?!

Phil.
 
I have found that a particular cause of reports running slowly is complex filter statements, i.e. when you use the "Define" button in the Filter box. I'm not sure if this is relevant to your situation, but I thought it worth mentioning.
 