
What's faster?


clegg (Programmer)
I have quite a few SPs that need changing to accommodate exchange rate calculations. These SPs are run possibly thousands of times each day, so they need to be as fast as possible. Basically, data comes up from several shops to the HQ and gets inserted.

Now, one of those stores is in a foreign country, but the HQ would like all the data in the same home currency, so when the data from that specific store comes up it needs converting. I've got no problem with this, but the conversion would be the exception rather than the rule, and I'm just wondering which would be fastest:

1. Select the exchange rate - if it's NULL (therefore no conversion required) set it to zero.
2. In the insert, do something like the following (there will be more than one column that needs changing, BTW):

Code:
INSERT INTO
	([Column1])
VALUES
	(CASE WHEN @ExchangeRate = 0 THEN @SalesTax ELSE (@SalesTax / @ExchangeRate) END)

OR

1. Select the exchange rate - if it's NULL (therefore no conversion required) set it to 1.
2. In the insert, do something like the following (there will be more than one column that needs changing, BTW):

Code:
INSERT INTO
	([Column1])
VALUES
	(@SalesTax / @ExchangeRate)

Which is quicker for SQL to evaluate, given that there are five shops sending data and only one that will need converting?
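
To make the two options concrete, here's a rough sketch of the whole flow for each one. The table names (ExchangeRates, SalesData) and the @ShopID/@SalesTax parameters are just placeholders for whatever the real SP uses:

Code:
-- Option 1: a missing rate becomes 0, and a CASE decides whether to divide.
DECLARE @ExchangeRate money
SELECT @ExchangeRate = ISNULL(ExchangeRate, 0) FROM ExchangeRates WHERE ShopID = @ShopID

INSERT INTO SalesData ([Column1])
VALUES (CASE WHEN @ExchangeRate = 0 THEN @SalesTax ELSE @SalesTax / @ExchangeRate END)

-- Option 2: a missing rate becomes 1, so dividing is always safe and no CASE is needed.
SELECT @ExchangeRate = ISNULL(ExchangeRate, 1) FROM ExchangeRates WHERE ShopID = @ShopID

INSERT INTO SalesData ([Column1])
VALUES (@SalesTax / @ExchangeRate)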

TIA
Clegg
 
Try running both versions with the execution plan on; it should give you the expected percentage of time taken by each version.

I'd guess the latter will be faster, based on total guesswork...

--------------------
Procrastinate Now!
 
I suspect the time taken by the two calculations will be roughly the same. If you are experiencing performance problems, it is probably not because of this calculation. I would suggest you look elsewhere for your performance problems.

-George

"the screen with the little boxes in the window." - Moron
 
Thanks for the replies.

gmmastros - I'm not experiencing any performance problems as yet - the question was asked so as to avoid them in the future.

crowley16 - I've run some tests and the results are identical, so I'll go with the easiest route, I think - the one with the least typing!!
 
Execution plans do not show differences in the time or cost of expression evaluation. They are good for comparing data access plans, index usage, and overall query strategy, but they are completely useless for accounting for the CPU time spent evaluating expressions.
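
If you want a rough number without setting up a trace, SET STATISTICS TIME ON at least reports parse/compile and execution CPU per statement, which the graphical plan won't - though it suffers from the same coarse timer resolution mentioned below. A quick sketch (the variables and query are just placeholders):

Code:
DECLARE @ExchangeRate money, @SalesTax money
SELECT @ExchangeRate = 1.2345, @SalesTax = 17.50

SET STATISTICS TIME ON

-- CPU time and elapsed time for this statement appear in the Messages tab.
SELECT CASE WHEN @ExchangeRate = 0 THEN @SalesTax ELSE @SalesTax / @ExchangeRate END

SET STATISTICS TIME OFF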

clegg,

The queries you gave as examples all use variables for their values, not columns, so the time taken to evaluate the expressions will be minuscule and comparable. You will probably find that declaring extra variables and having an extra statement to set them costs more.

On the other hand, if you use the same expression many times, it can be better to declare and set first, because you're "amortizing" the cost of the declarations over more than one expression.
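
For example, something along these lines, assuming the SP already has @ExchangeRate, @SalesTax, @NetAmount and @GrossAmount as parameters (the extra columns are invented just to show the reuse):

Code:
-- Work out the divisor once, then reuse it for every column that needs converting.
DECLARE @Divisor money
SET @Divisor = CASE WHEN @ExchangeRate = 0 THEN 1 ELSE @ExchangeRate END

INSERT INTO SalesData ([SalesTax], [NetAmount], [GrossAmount])
VALUES (@SalesTax / @Divisor, @NetAmount / @Divisor, @GrossAmount / @Divisor)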

You just have to test it and find out. You test it with SQL Profiler, running things hundreds of times, alternating methods, and aggregating the results. Make sure that the thing you're measuring runs long enough for the measurement to mean something: the reported CPU time has a resolution of something like 13 or 17 ms, so processes that run faster than that won't even show up. If one takes 7 ms and another takes 1 ms, they could both always show up as 0 ms, which makes them hard to compare, despite the fact that one is seven times faster than the other.
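
One crude way around the coarse timer is to put each method in a tight loop and time the whole batch, so the total comfortably exceeds the resolution. A rough sketch (values and loop count are made up):

Code:
-- Run one method many times and time the whole loop;
-- repeat with the other method and compare the totals.
DECLARE @i int, @start datetime, @x money
DECLARE @ExchangeRate money, @SalesTax money
SELECT @ExchangeRate = 1.2345, @SalesTax = 17.50, @i = 0, @start = GETDATE()

WHILE @i < 100000 BEGIN
	SET @x = CASE WHEN @ExchangeRate = 0 THEN @SalesTax ELSE @SalesTax / @ExchangeRate END
	SET @i = @i + 1
END

SELECT DATEDIFF(ms, @start, GETDATE()) AS TotalMs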

Look:

Code:
CREATE FUNCTION ComplexCalculation (@a int)
RETURNS int
AS
BEGIN
	RETURN
		((Power(@a, 2) / 2 + 4) * 7 - 9) * 3
		- ((Power(@a, 2) / 2 + 4) * 7 - 9) * 2
		+ ((Power(@a, 2) / 2 + 4) * 7 - 9) * 5
		- ((Power(@a, 2) / 2 + 4) * 7 - 9) * 4
END
GO
CREATE FUNCTION SimpleCalculation (@a int)
RETURNS int
AS
BEGIN
	RETURN @a * 2
END
GO
CREATE TABLE #TestData (Val int)

-- Populate roughly 10,000 rows of test data (999 iterations of 10 rows each).
DECLARE @i int
SET @i = 1
WHILE @i < 1000 BEGIN
	INSERT #TestData SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 UNION ALL SELECT 10
	SET @i = @i + 1
END
GO

-- turn on Show Execution Plan here and run the next two lines.
-- Then turn off Show Execution Plan, and go run SQL Profiler, capturing the SQL:StmtCompleted event.
SELECT Val, Calc = dbo.ComplexCalculation(Val) FROM #TestData
SELECT Val, Calc = dbo.SimpleCalculation(Val) FROM #TestData

-- Remember to turn off Profiler when you're done.
GO

DROP FUNCTION ComplexCalculation
DROP FUNCTION SimpleCalculation
DROP TABLE #TestData
The execution plan for those two queries is identical: same cost, same method (table scan). And the results from SQL Profiler?

[tt]          CPU   Reads   Duration
Complex   156    23        366
Simple     47    23        440[/tt]

Query plans are only ONE of the tools you should use to determine the performance of your queries - especially when a function is involved, but the principle holds true even for complex expressions. Take the expressions out of the functions above and put them inline in the queries: same execution plans. There's an improvement from not having to call the function, but the complex expression still costs more CPU time than the simple one.
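
In other words, something like this (run against the same #TestData table, before it's dropped):

Code:
-- The same two calculations inlined rather than wrapped in UDFs.
-- The plans are still identical, but the complex expression costs more CPU.
SELECT Val, Calc =
	((Power(Val, 2) / 2 + 4) * 7 - 9) * 3
	- ((Power(Val, 2) / 2 + 4) * 7 - 9) * 2
	+ ((Power(Val, 2) / 2 + 4) * 7 - 9) * 5
	- ((Power(Val, 2) / 2 + 4) * 7 - 9) * 4
FROM #TestData

SELECT Val, Calc = Val * 2 FROM #TestData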

[COLOR=black #d0d0d0]When I walk, I sometimes bump into things. I am closing my eyes so that the room will be empty.[/color]
 