
What to use: analyze or dbms_utility?

bookouri

Can anybody tell me if it's recommended to continue using ANALYZE on 9i and 10g databases, or should we be switching to dbms_utility.analyze_database? Some of the info I'm finding is contradictory: some sources recommend switching to the new way, and some say the new way causes trouble because it analyzes SYS's schema. But most of the material I'm finding is several years old. Is there a currently preferred way to get analyzing done?

thanks for any comments
 
Bookouri,

In Oracle 8i, you had to DELETE STATISTICS for the SYS schema. For Oracle 9i and newer, the statistics-gathering packages do not gather stats for the SYS schema.
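A common way to handle that cleanup under 8i was to generate the DELETE STATISTICS statements from the data dictionary (a sketch, not necessarily the exact commands anyone used back then):
Code:
SELECT 'analyze table SYS.'||table_name||' delete statistics;'
  FROM dba_tables
 WHERE owner = 'SYS';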

I, personally, use this code to gather my statistics:
Code:
SQL> exec dbms_utility.analyze_schema('<schema name>','ESTIMATE',estimate_percent => 10)
Estimating statistics is remarkably accurate and far less CPU intensive than the COMPUTE option (which reads every record in the tables).
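For comparison, the COMPUTE variant of the same call would be (a sketch; '<schema name>' is a placeholder):
Code:
SQL> exec dbms_utility.analyze_schema('<schema name>','COMPUTE')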

You are also supposed to be able to gather statistics with:
Code:
exec dbms_stats.gather_database_stats(cascade=>true);
...but I have had trouble with that method. (Sorry, I cannot remember what trouble I had.) So I opted for the highly successful schema-by-schema method for gathering stats.
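If you would rather stay with dbms_stats but still go schema by schema, the equivalent call should look roughly like this (a sketch with placeholder values, not the exact command I run):
Code:
exec dbms_stats.gather_schema_stats(ownname => '<schema name>', estimate_percent => 10, cascade => TRUE)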

[santa]Mufasa
(aka Dave of Sandy, Utah, USA)
[I provide low-cost, remote Database Administration services: www.dasages.com]
 
Thanks, that's what I was wondering about. Some sources stated that dbms_utility would analyze SYS's tables and some said the opposite. I'm guessing some of the sources I was reading were old and referring to 8i. Most of the more recent ones seem to recommend the dbms_stats.gather... format, but there doesn't seem to be any consensus. I put a lot more confidence in what you guys recommend than in those random sources.

 
Hi,

on live DBs I usually do the following:
1) Turn on monitoring on all tables of the application schema (unless I know a table doesn't have to be analyzed).
You can generate the code for that with the following statement:
Code:
SELECT 'alter table '||table_name||' monitoring;'
  FROM dba_tables
 WHERE owner = '&schema_owner'
2) Have a job call dbms_stats.gather_schema_stats with estimate_percent => NULL (= compute), cascade => TRUE and options => 'GATHER STALE' (a sketch of this call follows below).

As only those tables and indexes that really need fresh statistics get analyzed, I can use compute and still stay within the time window for this task.
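The job call from step 2 would look roughly like this (a sketch; &schema_owner is again a placeholder):
Code:
exec dbms_stats.gather_schema_stats(ownname => '&schema_owner', estimate_percent => NULL, cascade => TRUE, options => 'GATHER STALE')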
 
thanks, I've never looked at monitoring. I guess I'll have to look that up...

 
Can somebody give me an idiot's guide to what monitoring actually does? The documents I've found haven't really explained monitoring in a way that lets me understand what it will do for me and how it relates to analyzed tables, etc.

thanks
 
Simplified: monitoring keeps track of inserts, updates, deletes and truncates on the monitored table. You can query that information through the <user|all|dba>_tab_modifications views. When a certain percentage of the table's data has been modified and you use the GATHER STALE option of dbms_stats.gather_<database|schema>_stats, the table will be analyzed.
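To see what monitoring has recorded so far, a query like this works (a sketch; the modification counters are only flushed to the view periodically, and in recent versions dbms_stats.flush_database_monitoring_info can flush them on demand):
Code:
exec dbms_stats.flush_database_monitoring_info

SELECT table_name, inserts, updates, deletes, truncated
  FROM user_tab_modifications;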
 
So the monitoring sets things up so that dbms_stats.gather is more efficient when it runs? Or do you manually use the info from dba_tab_modifications to know when to run dbms_stats? Does the monitoring "cost" anything as far as performance in the live database? Are there any drawbacks to monitoring?

 
Monitoring sets things up for dbms_stats; there is no need to use the info manually. Nothing is free - monitoring "costs" something for sure, but the impact should be minimal (it was barely measurable when I evaluated it).
If there are any drawbacks, I haven't encountered them yet, and I have been using monitoring for some years now.

But you'll have to test whether this method fits your needs. Applications tend to handle data differently. If all your tables had to be reanalyzed on every run of dbms_stats, there would be no benefit from monitoring and I wouldn't use it.

You may use the LIST STALE option of gather_schema_stats to find out which tables would be analyzed (see the sketch below).
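Something along these lines lists the stale objects without gathering anything (a sketch; SCOTT is a placeholder schema):
Code:
DECLARE
  stale_list dbms_stats.objecttab;
BEGIN
  -- LIST STALE only reports the stale objects; no statistics are gathered
  dbms_stats.gather_schema_stats(
    ownname => 'SCOTT',
    options => 'LIST STALE',
    objlist => stale_list);
  FOR i IN 1 .. stale_list.COUNT LOOP
    dbms_output.put_line(stale_list(i).objtype||' '||stale_list(i).objname);
  END LOOP;
END;
/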

There is also the option GATHER AUTO (and LIST AUTO), but I don't have experience with that one.
 