
Using only parts of CGI and DBI


rpack78

Programmer
Oct 1, 2004
35
US
I have an application that uses CGI and DBI. It is very robust and performs a lot of queries and has a lot of traffic. I'm having problems with the CPU of the server staying at 100%. I read online that to speed up my pages I can use parts of CGI and DBI instead of using the whole thing. How do I do this?

The only part of CGI I really need to use is the part that gets the variables from the URL string...

use CGI;
my $q = CGI->new;
my %var;
foreach ($q->param()) {
    $var{$_} = $q->param($_);
}
 
They mean you should specify exactly what you want from these modules rather than importing them whole. So...

use CGI qw(:standard);

You can read up on the exported functions/variables for a given module (on CPAN, for example).
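For example, since the original snippet only ever calls param(), a minimal sketch would import just that one function (CGI.pm lets you import individual function names as well as tag groups like :standard):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Import only param() instead of the module's whole interface
use CGI qw(param);

my %var;
foreach my $name (param()) {
    $var{$name} = param($name);
}
```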

However, I really doubt that this is what's causing the server to stay at 100%. It'll speed up loading a page a little, but not drastically - at least not in my experience. It's certainly not going to keep the server at 100% all by itself.
 
Cool, thanks, I'll give it a shot, but I agree with you - I don't think this is the problem. The server has more than enough horsepower to handle this many page requests. When I look in Task Manager under Processes, there are like 30 perl.exe's in there. If I restart the MySQL service or restart IIS, they all disappear and the server is back to normal for an hour, and then it starts getting bogged down again. I can't figure out what the problem is; I am destroying all of my statement handles and disconnecting from the database. The server just can't seem to keep up.

It has 2 GB RAM and a 3.4 GHz processor. There are about 100 users that are hitting the main page every 10 seconds or so. (The page automatically refreshes after 10 seconds). Any ideas?
 
How heavy are your DB queries, i.e. how long does it take to service a request for the main page? If you have 100 users hitting the page every 10 seconds on an auto-refresh, and it takes 3 seconds to connect, run the query, and render the page, then this will explain the 30 instances of perl.

Open your task manager, click on the CPU title column to sort by descending CPU usage. Let it run for a while, and check to see that your PIDs are churning over. If any seem to be hanging around for more than a few seconds, try to find out what they are doing (even something as crude as logging a message with the start/stop times, script name, and parameters for each transaction might give you a pointer). Do any of them have excessive amounts of memory allocated to them?
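As a sketch of that crude logging idea: something like this at the top and bottom of each script would record start/stop times, script name, PID, and parameters (the log path here is just an example - use whatever location IIS can write to):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw(param);

# Crude per-request trace: start/stop times, script name, PID, parameters
open my $log, '>>', 'C:/logs/cgi-trace.log' or die "Can't open log: $!";
my $start = time;
printf $log "START %s %s pid=%d params=%s\n",
    scalar(localtime $start), $0, $$,
    join(';', map { "$_=" . param($_) } param());

# ... existing request handling goes here ...

printf $log "STOP  %s %s pid=%d elapsed=%ds\n",
    scalar(localtime), $0, $$, time - $start;
close $log;
```

Comparing the START and STOP lines will show you which requests are slow and which ones never finish.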

Who's doing all the CPU? Is it IIS, one or all of the multifarious perl instances, or MySQL?

Once you have a handle on where the resources are being consumed, you are in a much better position to pin it down further. Do you have a while loop that polls waiting for something to happen? Can you make it event-driven instead, or maybe increase the sleep time? Maybe your users are being lazy, causing a query with a dodgy LIKE clause to scan the whole table on each refresh?

Maybe you need a new/different index on the table?

Until you can identify the CPU hog, all this is just speculation - but if you get any more info, please post it, as I'm sure we can help [smile]

Steve

[small]"Every program can be reduced by one instruction, and every program has at least one bug. Therefore, any program can be reduced to one instruction which doesn't work." (Object::perlDesignPatterns)[/small]
 
If all you need is to parse the query string, use this:

use CGI qw/:cgi/;

":standard" imports all the HTML-generation functions as well as the ":form" and ":cgi" groups.

And this:

Code:
use CGI;
my $q = CGI->new;
my %var;
foreach ($q->param()) {
  $var{$_} = $q->param($_);
}

can be written as:

Code:
use CGI;
my $q = CGI->new;
my %vars = $q->Vars;
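One caveat with Vars, per the CGI.pm docs: a parameter that appears more than once in the query string comes back as a single value with the parts joined by null bytes, so you may need to split them back apart:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q    = CGI->new;
my %vars = $q->Vars;

# A multi-valued parameter such as ?id=1&id=2 arrives as "1\02",
# i.e. the values joined with a null byte - split it back apart:
for my $name (keys %vars) {
    my @values = split /\0/, $vars{$name};
    # ... work with @values ...
}
```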
 
Ok, I sorted by CPU and mysqld-nt.exe is taking up to 95% of the processing power. The perl.exe instances disappear after about 3 seconds.

I am doing anywhere from 10 to 20 queries on the main page which automatically refreshes every 10 seconds. The queries are written as efficiently as possible with no "LIKE" clauses.

Is it possible that it's just too much for the server to handle? I can't think of anything else to try besides rewriting the page in PHP.
 
But you yourself just said that perl is not the problem and that it's something within MySQL. You can rewrite your page in assembler if you want - that won't make mysqld-nt.exe any faster.
On a lighter note, perhaps it's at 95% simply because the CPU would otherwise be idle? If it's just using the available CPU bandwidth to do what it needs, that's OK. Other than the number itself, is there noticeable, verifiable slowness that you can actually see or feel?
 
If MySQL is taking all the CPU, rewriting the page in PHP isn't going to make any difference, you will still be running the same queries.

Worst case scenario: 30 concurrent tasks, each generating 20 queries over a 3-second request = 30 * 20 / 3 = 200 database queries starting per second. Every second. Still wondering where your CPU is going?

Are they all running the same queries? Or do they have different selection criteria? If they're all the same, you could consider running some or all of the queries on some kind of asynchronous timer pop, and caching the results as a ready-made HTML page you can just serve up when they refresh. Then you'd only run the queries once every 10s, but to the end users it would appear that the page had been refreshed.
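A sketch of that caching idea (the cache path is a placeholder, and build_page() stands in for whatever currently runs the 10-20 queries and returns the finished HTML): serve the cached file if it's younger than the refresh interval, otherwise rebuild it once and serve the fresh copy.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $cache_file = 'C:/Inetpub/cache/main_page.html';  # example path
my $max_age    = 10;                                 # matches the refresh

if (-e $cache_file && time - (stat $cache_file)[9] < $max_age) {
    # Cache is fresh - serve it without touching the database
    open my $fh, '<', $cache_file or die "Can't read cache: $!";
    print "Content-type: text/html\n\n";
    print while <$fh>;
    close $fh;
}
else {
    # Cache is stale - rebuild it once, then serve the new copy
    my $html = build_page();    # hypothetical: your existing page builder
    open my $fh, '>', $cache_file or die "Can't write cache: $!";
    print $fh $html;
    close $fh;
    print "Content-type: text/html\n\n", $html;
}
```

With 100 users refreshing every 10s, that turns roughly 1,000 query batches per 10 seconds into one.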

Maybe move the DB to another machine, perhaps one with multiple processors, so you can get some concurrency?

Also, when you are CPU bound like this, you need to think about latent demand. Your server is already going flat out; typically what happens next is that you move to a server with twice the capacity, only to discover it's immediately running at 80% because of the latent demand that existed before but couldn't be serviced by the old machine.

Steve

 
azazzello said:
is there noticeable, verifyable 'slowness' that you actually see/feel?
I would have thought that 3s to turn round a request at the server would be slow by most standards. By the time you add in network delay, firewalling, proxies, and any client-side delay, the end-user will be lucky to see 5s.

Transaction processing is all about getting the request in, processing it as quickly as possible, and freeing up resources so you are ready to process the next one.

You can ask MySQL to log all queries to a file (I don't know what format you get - ideally start/stop/CPU times), and a few minutes' worth would be a useful sample. Process the log (using perl, naturally) and multiply the resource usage by the frequency. Whichever query comes out top of the list is the one you need to look at first. Use EXPLAIN to see what it's doing. The 80/20 rule applies here - normally 80% of the benefit can be derived from tuning 20% of the queries...
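As a rough sketch of that log-crunching step, assuming one query per line on STDIN (adjust the match to whatever format your log actually uses): collapse the literals so queries that differ only in their values count as the same shape, then print the shapes by frequency.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Tally how often each query "shape" appears in the log
my %count;
while (my $line = <>) {
    chomp $line;
    next unless $line =~ /\b(SELECT|INSERT|UPDATE|DELETE)\b/i;
    # Collapse literals so "id = 5" and "id = 9" count as one shape
    (my $shape = $line) =~ s/'[^']*'/?/g;
    $shape =~ s/\b\d+\b/?/g;
    $count{$shape}++;
}

# Most frequent shapes first - these are the ones to EXPLAIN
for my $shape (sort { $count{$b} <=> $count{$a} } keys %count) {
    printf "%6d  %s\n", $count{$shape}, $shape;
}
```

Weight the counts by per-query elapsed time if your log has it, and the top of the list is where your tuning effort pays off first.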

Steve

 
Thanks Steve, I will do that right now. I did reduce the file size of my CGI script from 50k to 20k by moving a bunch of subroutines to another file. That seemed to help quite a bit, but the server is still choking.
 
