
How to monitor and compare performance of same command on two servers


yoojamu (MIS)
Oct 3, 2011
Hardware: Sun T2000 server with a 1 GHz SPARC processor
OS: Solaris 10

I am trying to tar and gzip Oracle database files with a total size of 55 GB. The Sun server takes almost 5 hours to finish this process. The server is not busy with anything else when I run this command:

Code:
tar cEf - . | gzip -c > /u03/bkp/test.tar.gz

I have migrated the same database to a Red Hat Linux 4 box running in a VM on an i5 processor (2.6 GHz). If I issue the same command on this Linux box, it takes only half the time of the Sun server.

How can I compare these servers to find out what is making the Sun server take such a long time? It would be very helpful if anybody could point me to the tools and methods I can use to find the issue.

Regards,
Yoonus
 
Look at the clock speed of the processors. A lot of processes are very CPU intensive and get MUCH better performance just from being put on a faster CPU, regardless of architecture (SPARC vs Intel vs ???).

The "[tt]tar[/tt]" command will run faster or slower based on faster or slower drives and IO system. The "[tt]gzip[/tt]" command is very CPU intensive and will run faster or slower based on CPU clock speed.

From your post, if I'm reading it correctly, it's 1 GHz on the Sun T2000 and 2.6 GHz on the Linux box. I believe that right there is why it runs in half the time.
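If you want to see for yourself where the time is going on the Sun box, a couple of the standard Solaris tools will show it while the backup runs (run these in a second terminal; the 5 is just a sampling interval in seconds):
Code:
vmstat 5       # CPU columns: us+sy near 100 with id near 0 means the CPU is the limit
iostat -xn 5   # %b shows how busy each disk is; high %b with lots of idle CPU means IO is the limit
prstat         # shows which processes are actually burning the CPU
On a run like yours I would expect gzip to be pegging a CPU while the disks sit mostly idle, but the numbers will tell you for sure.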

 
Thank you Sam. You are right, it is because of the low processor speed. I found that there is pbzip2 for parallel compression, which will utilize every core of my CPU.

I downloaded and installed it on my server from this site. Compressing with this utility now takes only 75 minutes, where it used to take 300 minutes with gzip.
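For the record, the pipeline I am running now looks roughly like this (pbzip2 writes bzip2 format rather than gzip, so the extension changes; by default it uses every CPU it finds):
Code:
tar cEf - . | pbzip2 -c > /u03/bkp/test.tar.bz2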

With regard to your statement "The "gzip" command is very CPU intensive and will....", how do I know whether a utility is CPU intensive or IO intensive?
 
Just think about what it's doing. The "[tt]tar[/tt]" command just puts a bunch of files into one file, so it's mostly just doing IO. Not a lot of "thinking" to do. The "[tt]gzip[/tt]" command on the other hand has to compress the file, which requires a lot of "thinking" to decide how best to compress it, then a bunch of processing to actually do the compression.

The "[tt]time[/tt]" command will show you how much CPU a command takes. I often do this...
Code:
time gzip -9v reallybig.tar
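The way to read the output: if "user" plus "sys" comes close to "real", the command spent its time computing (CPU bound); if "real" is much bigger than "user" plus "sys", it was mostly waiting on disk (IO bound). You can also time the two halves of your pipeline separately, something like this (sending the output to /dev/null just keeps the output disk out of the picture):
Code:
time tar cEf - . > /dev/null             # expect real much bigger than user+sys (IO bound)
time gzip -9c reallybig.tar > /dev/null  # expect user close to real (CPU bound)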
 