
Using EXEC within Stored Procedure

Status
Not open for further replies.

rickj65

Programmer
Jun 5, 2002
79
US
I have a stored procedure that essentially takes data from a denormalized data warehouse table and plugs certain data elements into a normalized table.

This requires running 84 INSERT statements for each denormalized record.

Rather than manually creating 84 insert statements in my sp, I'm running a loop to build the SQL string and then executing the SQL string using EXEC(@strSQL). Unfortunately this sp takes over 2 hours to run, although it is designed as an overnight process.
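To illustrate, the loop looks roughly like this (the table and column names here are simplified placeholders, not my real schema):

```sql
-- Simplified sketch of the build-and-execute loop
-- (NormTable, DenormTable, and the ElementN columns are placeholders)
DECLARE @i INT, @strSQL VARCHAR(1000)
SET @i = 1
WHILE @i <= 84
BEGIN
    SET @strSQL = 'INSERT INTO NormTable (RecID, ElementNo, ElementValue) ' +
                  'SELECT RecID, ' + CAST(@i AS VARCHAR) + ', Element' +
                  CAST(@i AS VARCHAR) + ' FROM DenormTable'
    EXEC (@strSQL)
    SET @i = @i + 1
END
```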

Question: Will I get better performance if I hardcode the INSERT statements rather than building them dynamically and running them with EXEC? If I use hardcoded SQL statements, I'm assuming the query plan is saved and optimized. Is the same true when I run a dynamic SQL statement using EXEC?

TIA for your input,

Rick

PS - I will test this theory eventually, I just don't want to start writing out 84 INSERT statements yet...especially if the performance benefit is going to be negligible.


 
My personal experience is that dynamic SQL is less efficient than hard-coded SQL in a stored procedure, because the statements can't be optimized in advance.

Instead of writing out all 84 statements by hand, why not save the results of your build routine into a table, then cut and paste them into a stored procedure? That will at least save you the typing.
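In other words, something along these lines (GeneratedSQL is just a placeholder name for a work table):

```sql
-- Sketch: capture the generated statements instead of executing them
-- (GeneratedSQL is a placeholder work table; @strSQL is the string
--  already being built inside the existing loop)
CREATE TABLE GeneratedSQL (Stmt VARCHAR(1000))

-- inside the existing build loop, replace EXEC (@strSQL) with:
INSERT INTO GeneratedSQL (Stmt) VALUES (@strSQL)

-- afterwards, select the statements out and paste them into the new procedure:
SELECT Stmt FROM GeneratedSQL
```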
 
You will gain some performance in the areas you have mentioned: you will save on command creation, as well as on the parsing and creation of the associated execution plan.

What you also have to take into consideration is that you are doing multiple inserts. This action, in and of itself, will cause your command to be re-parsed and a new execution plan created, based on an internal mechanism of SQL Server. I'm not sure exactly what that threshold is, but it does exist.

So if you can cut and paste your INSERT statements, your SP will run faster than having all the statements built dynamically.
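One more option if you want to keep the dynamic approach: sp_executesql with parameters lets SQL Server cache and reuse a plan for each distinct statement text, unlike EXEC() of a freshly concatenated string. Column names still have to be concatenated in, but the values can be parameterized. A rough sketch (table and column names are placeholders):

```sql
-- Sketch: parameterized dynamic SQL so the plan for each distinct
-- statement can be cached and reused (sp_executesql requires NVARCHAR)
DECLARE @stmt NVARCHAR(1000), @CurrentRec INT
SET @CurrentRec = 1   -- placeholder; would come from the driving loop

SET @stmt = N'INSERT INTO NormTable (RecID, ElementNo, ElementValue) ' +
            N'SELECT RecID, @ElemNo, Element1 ' +
            N'FROM DenormTable WHERE RecID = @RecID'

EXEC sp_executesql @stmt,
     N'@ElemNo INT, @RecID INT',
     @ElemNo = 1, @RecID = @CurrentRec
```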

Thanks

J. Kusch
 
I think it really depends on how big your denormalized table is. If it is huge, it may help to create some indexes matching the criteria generated by your dynamic query.

Is your EXEC(@strSQL) making numerous transactions? If that is the case, you may want to break it up into
Exec(@strSQL1)
Exec(@strSQL2)
Exec(@strSQL3)

This way your transaction log can shrink before it gets too big. I have had issues where my transaction log reached over 25 gigs, which slowed performance massively and added hours of run time.
When I partitioned the query by a unique indexed key within a loop, the same overnight query, which updated over 1,000 million records, took an hour.
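One way to sketch that kind of partitioning (the key column, batch size, and table names are placeholders):

```sql
-- Sketch: process the inserts in key ranges so each batch commits
-- separately and the transaction log can be reused between batches
-- (RecID, NormTable, DenormTable, and @BatchSize are placeholders)
DECLARE @MinID INT, @MaxID INT, @BatchSize INT, @strSQL VARCHAR(1000)
SELECT @MinID = MIN(RecID), @MaxID = MAX(RecID) FROM DenormTable
SET @BatchSize = 1000

WHILE @MinID <= @MaxID
BEGIN
    SET @strSQL = 'INSERT INTO NormTable (RecID, ElementNo, ElementValue) ' +
                  'SELECT RecID, 1, Element1 FROM DenormTable' +
                  ' WHERE RecID BETWEEN ' + CAST(@MinID AS VARCHAR) +
                  ' AND ' + CAST(@MinID + @BatchSize - 1 AS VARCHAR)
    EXEC (@strSQL)
    SET @MinID = @MinID + @BatchSize
END
```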

Let me know if that helps...

Bygs
 
Only one tip:
If you decide to write out those inserts and you have Query Analyzer,
instead of EXEC(@strSQL) just use
PRINT @strSQL

Then select everything from the Messages tab of QA and paste it in as a new query.

Zhavic

---------------------------------------------------------------
In the 1960s you needed the power of two Commodore 64s to get a rocket to the moon. Now you need a machine which is a vast number of times more powerful just to run the most popular GUI.
 
Thank you everyone for your input. Even though I was testing the procedure by PRINTing the resulting @strSQL statement, it hadn't even occurred to me that I could copy those statements into the SP as a shortcut. Can you say brain fart?

I guess my biggest issue is one of ego - my procedure is really (IMO) elegantly written (pat on my own back). The hardcoding just seemed amateurish even though I'm sure it will perform better. I know that's wrong thinking, but you guys know the deal with prideful programmers I'm sure!

Bygbobbo - My denormalized table is a bit more than 5,000 rows, which translates to approx. 450,000 INSERTs. My EXEC(@strSQL) statement executes only one INSERT query and then proceeds with the loop to create the next @strSQL statement. I'm not grouping the INSERTs as a batch under one @strSQL variable, so I don't think the transaction log will be a problem...but I could be wrong.

Thanks again for the input,
Rick
 