
What is fastest? If Then / If Then Else / Case

Status
Not open for further replies.

benvegiard (Programmer) — May 15, 2003
Can anyone point me to an article, or does anyone have any backing information on, what the fastest control structure is in SQL Server?

Here's my scenario: I'm working on some performance tuning. A very common stored proc (fired about 50 - 100 times a minute) has a little over 200 if/thens to decide what data to bring back (it's a centralized GetPickList type of thing where the caller passes in which list it wants...)

Since I can't break that out into separate procs without a lot of effort from the developers, I wanted to see whether taking the time to convert it to If/Then/Else, or to make it a CASE, would gain us anything.

Thanks,
Ben
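For illustration, a sketch of the kind of proc described above (all table and column names here are invented):

```sql
-- Hypothetical sketch of the proc described: one big chain of IF blocks,
-- each returning a different pick list. With 200+ branches, SQL Server
-- compiles and sniffs one plan for the whole proc rather than a tight
-- plan per list.
CREATE PROCEDURE dbo.GetPickList
    @ListName varchar(50)
AS
BEGIN
    IF @ListName = 'States'
        SELECT StateCode, StateName FROM dbo.States ORDER BY StateName;
    IF @ListName = 'Countries'
        SELECT CountryCode, CountryName FROM dbo.Countries ORDER BY CountryName;
    IF @ListName = 'OrderStatuses'
        SELECT StatusID, StatusText FROM dbo.OrderStatus ORDER BY StatusID;
    -- ...roughly 200 more IF blocks like these...
END
```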
 
Typically, using lots of IF blocks within a stored proc is frowned upon, as it can keep SQL Server from reusing cached execution plans effectively.

If you can convert it to a single SELECT statement with CASE expressions, you avoid this, because the one execution plan can be cached and reused.

You would get the same result by breaking it apart into several stored procs.
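One way the single-SELECT-with-CASE approach can work, assuming the lists can be consolidated into one lookup table (a sketch; the schema here is hypothetical):

```sql
-- Hypothetical: if the pick lists all live in one lookup table, the 200
-- IF blocks collapse into a single, plan-cacheable SELECT. CASE can still
-- handle per-list display quirks inline.
CREATE PROCEDURE dbo.GetPickList
    @ListName varchar(50)
AS
BEGIN
    SELECT ItemCode,
           CASE WHEN @ListName = 'States' THEN UPPER(ItemText)
                ELSE ItemText
           END AS ItemText
    FROM dbo.PickListItems
    WHERE ListName = @ListName
    ORDER BY SortOrder;
END
```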

Denny
MCSA (2003) / MCDBA (SQL 2000) / MCTS (SQL 2005) / MCITP Database Administrator (SQL 2005)

--Anything is possible. All it takes is a little research. (Me)
[noevil]
 
I have actually gotten mixed results when I tried to benchmark If/Else versus Case.

Honestly, it depends on what you're doing. Sometimes CASE is better. Sometimes IF/ELSE is better, even if it's nested.

To find out which one works better for your particular query, write up the query with one and then go to "Display Estimated Execution Plan". Look for any Scans (Bad Things) or parts that indicate a high percentage of processing time. Scans can be changed to Seeks with Index Hints, BTW. Once you've checked the first version, write it up as a second version and check the Estimated Execution Plan for the other way. Pick the one which works better against your database.

Be aware, as I said above, that one will NOT ALWAYS work better than the other. It depends on the tables & data involved.
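Besides comparing the estimated plans, the two versions can be compared empirically. A sketch, assuming you've written both versions under hypothetical names:

```sql
-- Run each batch separately on a DEV box and compare the CPU/elapsed
-- times and logical reads that SQL Server prints to the Messages tab.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Version 1: the IF/ELSE chain
EXEC dbo.GetPickList_IfElse @ListName = 'States';

-- Version 2: the single SELECT with CASE
EXEC dbo.GetPickList_Case @ListName = 'States';

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```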



Catadmin - MCDBA, MCSA
"No, no. Yes. No, I tried that. Yes, both ways. No, I don't know. No again. Are there any more questions?"
-- Xena, "Been There, Done That"
 
And remember when you are performance tuning that developer effort does not trump performance. If what needs to be done to fix the problem is that the developers develop separate sps for different tasks, then that is what needs to be done. One of the major causes I've seen of poor performance in sps is developers programming for ease of development rather than for performance.

Questions about posting. See faq183-874
 
I couldn't have said it better.

Denny
MCSA (2003) / MCDBA (SQL 2000) / MCTS (SQL 2005) / MCITP Database Administrator (SQL 2005)

--Anything is possible. All it takes is a little research. (Me)
[noevil]
 
One thing I have done is pre-processed data-driven code. That is, I wanted the power and flexibility of using data-driven principles in my program, but I also wanted the speed of precompiled "hardcoding." So I did both, by using preprocessing. I wrote code that read data from tables and wrote corresponding code in various triggers. When I wanted to make a change, I updated the table, ran the code, and got what I needed.

Is there any possibility of something similar for your developers? There's no reason they couldn't work out something like that, with separate stored procedures for the different parts of the logic, so that each one gets compiled with an optimized query plan for its data set. You just have to come up with a naming convention and carefully analyze the proc's logic to see whether it lends itself to an ordered, table-driven layout.

Admittedly, the code was complex, as at times I had to construct dynamic SQL which queried tables to construct yet more, nested dynamic SQL. But when I was all done, I was able to make sweeping changes incredibly easily. And modifications weren't so hard when I started using tables for my dynamic SQL too.

That is, instead of putting my trigger prototype stubs in code, with lots of nested quotes, I put them in a lookup table, pulled the template into a variable, and used substitution like

Code:
INSERT [{TableName}]

--and

SELECT @TrigText = Replace(QueryText, '{TableName}', @TableName)
FROM AutoQueries
WHERE QueryName = 'AutoUpdateTrigger'
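Putting those pieces together, the table-driven generation step might look something like this (the AutoQueries schema and trigger template shown are invented for illustration):

```sql
-- Sketch of the preprocessing step: pull a code template from a table,
-- substitute the placeholder, and let EXEC compile the generated trigger.
DECLARE @TableName sysname = N'Orders';
DECLARE @TrigText nvarchar(max);

SELECT @TrigText = REPLACE(QueryText, '{TableName}', @TableName)
FROM dbo.AutoQueries
WHERE QueryName = 'AutoUpdateTrigger';

-- QueryText might hold a template such as:
--   'CREATE TRIGGER trg_{TableName}_U ON [{TableName}] AFTER UPDATE AS ...'
EXEC (@TrigText);
```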
Erik

PS Yes triggers should be avoided where possible. But sometimes business pressures don't allow for that...
 
It's a great idea, Erik, but the problem with some shops is that not only do they have too many developers working on projects, but most of what they're working on is new functionality, not correcting old stuff. When the developers are making schema changes and additions, it's hard to "pre-code" for that.

Back to the topic at hand, though. Ben, another tool you can use in working on this proc is Profiler. You can filter it by login ID, so while you're running the proc on a DEV box you see only your own processes. This will help you debug it.

Also, is there a reason why you can't break it up yourself instead of waiting for the developers? Find out how they call it in the client and all the variables they pass. Have the initial proc retain the same name and store all those variables and then have it call all the child procs as appropriate. This way, nothing breaks in the client and you've improved performance.
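That wrapper approach might look something like this (proc names are hypothetical):

```sql
-- Hypothetical dispatcher: the original proc keeps its name and signature,
-- so nothing breaks in the client, but each branch now calls a small child
-- proc that gets its own compiled, cached execution plan.
CREATE PROCEDURE dbo.GetPickList
    @ListName varchar(50)
AS
BEGIN
    IF @ListName = 'States'
        EXEC dbo.GetPickList_States;
    ELSE IF @ListName = 'Countries'
        EXEC dbo.GetPickList_Countries;
    ELSE
        EXEC dbo.GetPickList_Other @ListName;
END
```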



Catadmin - MCDBA, MCSA
"No, no. Yes. No, I tried that. Yes, both ways. No, I don't know. No again. Are there any more questions?"
-- Xena, "Been There, Done That"
 