We've essentially built a rule engine here. Rules are defined by specifying criteria that are applied when an application is run through "approval". What it boils down to is that I have about 250-350 unique T-SQL statements (they're basic statements, no subqueries and 2-3 joins at most) that are assembled in C#, and I need to run all of them against SQL Server. When I send them one by one it takes about 4-5 seconds to run all the requests, which is not acceptable. BTW, I'm using a single connection that I keep open to avoid creating a new connection each time. I thought about running batches, where I would concatenate an entire group's worth of T-SQL statements together, but I believe nvarchar's 4,000-character limit is going to kill that idea.
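For reference, this is roughly what the batching idea would look like; it's only a sketch, and `statements`, `conn`, and `RunInBatches` are placeholder names (the code assumes an already-open `SqlConnection`, as described above). Note that `SqlCommand.CommandText` is an ordinary .NET string, so the 4,000/8,000-character limits on `nvarchar(n)`/`varchar(n)` column and variable types don't apply to the batch text itself:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

static void RunInBatches(IEnumerable<string> statements, SqlConnection conn, int batchSize = 50)
{
    var batch = new StringBuilder();
    int count = 0;

    foreach (var sql in statements)
    {
        // Separate the generated statements with semicolons inside one batch.
        batch.Append(sql).AppendLine(";");

        if (++count % batchSize == 0)
        {
            // One round trip for the whole group instead of one per statement.
            using (var cmd = new SqlCommand(batch.ToString(), conn))
                cmd.ExecuteNonQuery();
            batch.Clear();
        }
    }

    // Flush any remaining statements that didn't fill a full batch.
    if (batch.Length > 0)
        using (var cmd = new SqlCommand(batch.ToString(), conn))
            cmd.ExecuteNonQuery();
}
```

Whether this actually helps depends on how much of the 4-5 seconds is per-round-trip overhead versus actual query execution time.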
Other than multithreading, which could get a little hairy on a web server, I'm starting to run out of ideas. It just seems like 250-350 basic queries should be able to execute faster than 4-5 seconds. Unfortunately, since we're using a "rules approach" I don't believe we can create a stored procedure / parameter approach that can be precompiled. Any ideas?