query counter

katchinska (Programmer)
Feb 16, 2008
I want to see which query is run the most in a given day, so I know which ones are most important to optimize. I would prefer to do this without downloading any patches or installing third-party software. The query profiler will let you see all of the queries that have been run, but it does not sum up a single query; you would have to go through by hand and count each instance. So if there is anyone out there who has had this problem, let me know.
 
I am not really familiar with this query profiler, but what kind of output does it give you??
From what you say it seems to me you could get somewhere by sorting/grepping your way through it, or not...?!?
 
Interesting one. You're equating the query that runs the most with the one to optimise. If that query (for example) gets one row from a table with a unique key, you don't have much scope to make it faster, and if you could, it would take you a long time and maybe a table redesign.
What I think you need to do is identify which queries take a long time, and from those see which are used the most, and then look at those. Remember, if you have a query that takes 5 seconds and is run twice a day, it is probably not worth looking at.
Final thing: beware of adding new indexes to speed up some queries, as that might slow down other updates and inserts.
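One way to find the slow ones without installing anything extra: turn on the slow query log and let the mysqldumpslow script that ships with MySQL aggregate it for you. Roughly like this (option names from memory and the paths are only examples, so check the docs for your version):
[tt]
# in my.cnf, under [mysqld], then restart the server
log-slow-queries = /var/log/mysql/mysql-slow.log
long_query_time  = 2

# summarise the log: -s c sorts the grouped queries by count, -t 10 shows the top 10
> mysqldumpslow -s c -t 10 /var/log/mysql/mysql-slow.log
[/tt]
That count per grouped query is pretty much the counter you are after, only limited to statements slower than long_query_time.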
 
Hey MacTommy, since you haven't heard of it, I'll give a little rundown on it.

mysql> set profiling=1;
Query OK, 0 rows affected (0.00 sec)

//that command starts it up; then you run a query (any query) and then type in

mysql> show profiles;

//that command brings up a table that looks like the one below

+----------+------------+----------------------+
| Query_ID | Duration   | Query                |
+----------+------------+----------------------+
|        0 | 0.00007300 | set profiling=1      |
|        1 | 0.00044700 | select * from client |
+----------+------------+----------------------+
2 rows in set (0.00 sec)

//See, it has the first thing you typed in the table, and it also has the query you ran. Each query gets a unique id, and if you keep going the table gets really big. I tried to run a query on this table, "select count(Query) from information_schema.profiling where Query = 'select * from client';", but that didn't work. You can still query the table, though; for example you can write this:

mysql> select sum(duration) from information_schema.profiling where query_id=1;

Hey, I just thought of a solution. What if I used this table to see which queries were run a lot, and then went and put triggers on those queries? The triggers would increment a count for that query in a (we'll call it) frequently-run-query table. Would that be a dumb idea?

 
OK, thanks for the explanation.

And what if you dump the output of this show profiles to a file and do a 'sort | uniq -c' on the Query column..??
That would give you every single query and the number of times it ran.
Or isn't that what you want?!?

Or you could also make a Perl one-liner that gives you all the queries, the number of times they ran, and the total time they took individually.

Suppose you have a file (called MySQL_profilingDump.txt) that looks like this:
[tt]
+----------+------------+----------------+
| Query_ID | Duration   | Query          |
+----------+------------+----------------+
|        4 | 0.00180200 | show tables    |
|        5 | 0.00131350 | show tables    |
|        6 | 0.00124850 | show tables    |
|        7 | 0.00124725 | show tables    |
|        8 | 0.00115375 | show databases |
|        9 | 0.00129925 | show databases |
|       10 | 0.00125400 | show databases |
|       11 | 0.00116425 | show databases |
|       12 | 0.00119300 | show databases |
|       13 | 0.00135775 | show databases |
|       14 | 0.00115750 | show databases |
|       15 | 0.00115675 | show databases |
|       16 | 0.00116475 | show databases |
|       17 | 0.00157750 | show databases |
|       18 | 0.00123000 | show databases |
+----------+------------+----------------+
[/tt]
Then you could say:
[tt]
> perl -ne "chomp; $hNrOfCalls{(split /\ \|\ ?/ )[2]}++; $hTimes{(split /\ \|\ ?/ )[2]} += (split /\ \|\ ?/ )[1] if(/\d/); END { while( ($sQuery, $fTime)= each %hTimes) {print \"$sQuery took $fTime seconds (called $hNrOfCalls{$sQuery} times)\n\";} }" MySQL_profilingDump.txt
[/tt]
Maybe not the most eloquent piece of Perl ;-), but it does give you this:
[tt]
show databases took 0.0137085 seconds (called 11 times)
show tables took 0.00561125 seconds (called 4 times)
[/tt]
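And if you would rather skip Perl, the sort/uniq idea boils down to something like this against the same dump file (a rough sketch, assuming you have awk/sort/uniq available, e.g. on a Unix box or under Cygwin):
[tt]
> awk -F'|' '/\|/ && $2 !~ /Query_ID/ { gsub(/^ +| +$/, "", $4); print $4 }' MySQL_profilingDump.txt | sort | uniq -c | sort -rn
[/tt]
That skips the border and header lines, trims the Query column, and prints each distinct query with the number of times it appears, most frequent first.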
 