
.NET web service vs. T-SQL as RS datasources


jcaulder (Programmer)
First, let me say I am not a report developer, but rather a .NET developer. Intermittently I am brought in to assist with queries and with creating complex logic in T-SQL. From the beginning I have thought that implementing extremely complex logic in T-SQL stored procedures was a bad approach, and I suggested that, to handle our complex queries and data manipulation, we would be better served by a high-level language like VB.NET.

I floated this idea to our group and it received a warm reception; now I want to confirm that it is a good idea and that I am not missing some pitfall with this approach.

The idea, and it obviously isn't a new one, would be to replace all of our complex T-SQL stored procedures with .NET web service calls that return XML data.

The advantages of .NET over T-SQL seem to me to be:
1) Quicker delivery time using a high-level language
2) Much better debugging tools/IntelliSense
3) Rich classes to assist with string and data manipulation
4) Encapsulation of logic via functions, classes, etc.
5) Much easier to maintain and understand the logic
6) Off-loads the data processing and manipulation to a cheaper web server rather than a SQL Server taxed with it. SQL just runs simple queries.
7) We have existing .NET code within desktop applications that already builds the datasets consumed by Crystal. We could take this existing logic, export the dataset to XML, and have a working Reporting Services version without needing to port the entire logic to T-SQL (a sketch of such a service follows this list).
8) Others?
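
On point 7, here is a minimal sketch of what such a service might look like, assuming a classic ASMX web service; the class, method, and column names are hypothetical stand-ins for our existing dataset-building logic, not actual code:

Code:
Imports System.Data
Imports System.Web.Services

<WebService(Namespace:="http://tempuri.org/reports/")> _
Public Class ReportService
    Inherits WebService

    ' SSRS's XML data source can consume the XML that a DataSet
    ' serializes to when returned from a web method like this one.
    <WebMethod()> _
    Public Function GetEmployeeProduction(ByVal employeeId As Integer) As DataSet
        ' Stand-in for the existing .NET logic that already builds
        ' the dataset consumed by Crystal.
        Dim ds As New DataSet("Production")
        Dim dt As DataTable = ds.Tables.Add("Rows")
        dt.Columns.Add("EmployeeId", GetType(Integer))
        dt.Columns.Add("Hours", GetType(Decimal))
        dt.Rows.Add(employeeId, 8D)
        Return ds
    End Function
End Class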

The disadvantages might be:
1) Potentially, dynamic SQL is passed from .NET to SQL Server, so it isn't precompiled with an execution plan like a stored procedure would be. However, this seems to be outweighed by the fact that SQL Server is no longer performing the complex processing. Further, many .NET SQL calls could still be to parameterized stored procedures (see the sketch after this list).
2) Requires knowing .NET to do report writing and development, unless this is separated into two roles. I am frequently brought in on complex tasks anyway, so I'm not sure this is much of a downside. Furthermore, our report developer wants to learn .NET, so this lets him grow in that area.
3) Others?
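
To illustrate the parameterized-call point in item 1, a rough sketch; the procedure and parameter names are made up for illustration:

Code:
Imports System.Data
Imports System.Data.SqlClient

Public Module DataAccess
    ' Calling a stored procedure with explicit parameters keeps the
    ' plan-caching benefit even though the caller lives in .NET.
    Public Function GetProductionRows(ByVal connectionString As String, _
                                      ByVal employeeId As Integer) As DataTable
        Dim dt As New DataTable("Rows")
        Using cn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand("dbo.GetProductionByEmployee", cn)
                cmd.CommandType = CommandType.StoredProcedure
                cmd.Parameters.AddWithValue("@EmployeeId", employeeId)
                Dim da As New SqlDataAdapter(cmd)
                da.Fill(dt)   ' Fill opens and closes the connection itself
            End Using
        End Using
        Return dt
    End Function
End Module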

Does anybody have any suggestions on whether this is a good or bad approach, and if so, why? Perhaps there is some other approach that is better.

It just seems terribly difficult to implement many of our queries/processing in T-SQL, and even once that's done, maintenance is a nightmare because of the hoops we usually have to jump through to get it done. I would ten times rather write .NET code to accomplish a task than T-SQL.

TIA!

J
 
I've never used .Net web services as a source for SSRS reports. However, I have developed extensively in both .Net and T-SQL. I think you will find that some "queries" will run faster in T-SQL and some in .Net, based on your description of running a simple query against SQL from the .Net code while doing the complex manipulation in .Net. By the way, are your SQL datasources optimized with indexes and the like?

Anyway, I would also recommend you strongly consider basing some of your reports on a datamart/data warehouse type of schema. Your complex business logic would be implemented in the ETL from your sources to the data warehouse schema. In that scenario, most of your reports would be simple T-SQL queries that run quickly due to the nature of your new database schema.


 
Thanks for the response.

Let me give a simple example of the challenge as I see it.

I need to get some data and want to loop through each row of the result set performing data manipulation, formatting, or further population of the row.

To do this in T-SQL (without using a cursor, due to their evilness), the steps are below, with a code sketch after them:
1) Declare a table variable, typing in every column and datatype, including an integer column to use for looping
2) Type out the query used to populate the table variable, and run it, inserting into the table variable
3) Declare all necessary looping control variables
4) Query the table variable to initialize the looping variables
5) Start the looping with a WHILE loop
6) Perform the row processing (in and of itself complex, but forget that for now)
7) Update the row in the table variable with the computed data from step 6, using the looping variable id in the WHERE clause
8) Get the next minimum id and set the looping variable to it
9) Load up the variables with the data for this row
10) Loop until the id is NULL
11) Select data from the table variable to return to the report
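
A condensed sketch of that pattern, with made-up table and column names (the real per-row processing would replace the simple CASE below):

Code:
-- Step 1: table variable with an integer id column for looping
DECLARE @Work TABLE (
    RowId      INT IDENTITY(1, 1) PRIMARY KEY,
    EmployeeId INT,
    Hours      DECIMAL(9, 2),
    Category   VARCHAR(20)
);

-- Step 2: populate the table variable
INSERT INTO @Work (EmployeeId, Hours)
SELECT EmployeeId, Hours
FROM dbo.Production;          -- hypothetical source table

-- Steps 3 and 4: declare and initialize the looping variable
DECLARE @Id INT;
SELECT @Id = MIN(RowId) FROM @Work;

-- Steps 5 through 10: WHILE loop with per-row processing
WHILE @Id IS NOT NULL
BEGIN
    UPDATE @Work
    SET Category = CASE WHEN Hours > 8 THEN 'Overtime' ELSE 'Regular' END
    WHERE RowId = @Id;        -- step 7: write the computed data back

    SELECT @Id = MIN(RowId) FROM @Work WHERE RowId > @Id;  -- step 8
END;

-- Step 11: return the result to the report
SELECT EmployeeId, Hours, Category FROM @Work;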

Writing all of that offers no IntelliSense, and any errors along the way are difficult to decipher (for me, anyway), many times referencing a line number that has nothing to do with the true error. The built-in functions are limited (string, date, etc. manipulation). I could go on...

In .NET (again, with a sketch after the steps):
1) Create the stored procedure with the query
2) Create a strongly typed dataset, datatable, and dataadapter automagically by referencing the sp from step 1
3) Now, for processing, fill the dataset/datatable
4) Use a construct like "For Each row In ds.dt.Rows" to loop through the data
5) Process the data (everything is strongly typed using row.Field1 = newValue, so typos are impossible)
6) Return the XML of the dataset/datatable
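
Roughly, using the same made-up names as before, and with plain DataRow indexers standing in for the strongly typed members a generated dataset would give you:

Code:
Imports System.Data
Imports System.Data.SqlClient

Public Class ReportBuilder
    ' Steps 3 through 6 from the list above.
    Public Function BuildReportXml(ByVal connectionString As String) As String
        Dim ds As New DataSet("Production")
        Using cn As New SqlConnection(connectionString)
            Dim da As New SqlDataAdapter("dbo.GetProduction", cn)
            da.SelectCommand.CommandType = CommandType.StoredProcedure
            da.Fill(ds, "Rows")                             ' step 3: fill
        End Using

        ' Add a column for the computed value (assuming the proc
        ' does not already return one).
        Dim tbl As DataTable = ds.Tables("Rows")
        tbl.Columns.Add("Category", GetType(String))

        For Each row As DataRow In tbl.Rows                 ' step 4: loop
            ' step 5: per-row processing
            Dim hours As Decimal = CDec(row("Hours"))
            If hours > 8D Then
                row("Category") = "Overtime"
            Else
                row("Category") = "Regular"
            End If
        Next

        Return ds.GetXml()                                  ' step 6: the XML
    End Function
End Class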

Rich functions pre-exist for most needs, with IntelliSense, real-time error checking, verbose and easy-to-understand errors, error handling with Try...Catch, strongly typed classes, etc.

Per your point, it isn't performance I'm concerned with; it's the total cost of creating and maintaining the logic. Some of these stored procedures run just fine; they are optimized as much as possible and have the necessary indexes to support them where appropriate. They are just hugely complex, with many nested CASE statements and looping, or else written in dynamic SQL when all else failed.

Also, regarding data warehouse/datamart solutions: many of these reports are 'real time' in nature. They are run frequently during the day to compare data and analyze real-time operations. They are not aggregate-type reports but rather detail-type.

I know it's perhaps difficult to understand without an example. A typical report may show an employee's production activity for today, his attendance codes (if any), his total payroll broken into specific categories (production, bonus, vacation, sick, subsidy, etc.), and perhaps some year-to-date data. Quite a bit of work goes into showing this data the way it is requested, and this is a simple example; hopefully it proves the point.

Either way we can present it the way they want it; it's just a question of the most efficient way to get there, .NET web service XML or T-SQL.

I think I need to try one out to see whether it works like I expect or I run into problems along the way. I just didn't want to spend too much time on it if I'm looking at this the wrong way.

Thanks again!

J

 
I think the best advice I could give you is that it just depends on your situation. Looping through rows and processing in .Net will probably be a lot faster than looping through rows and processing in SQL.

The .Net route may be easier to maintain as far as the development environment is concerned, especially for a .Net developer. A SQL developer might argue that it is easier to quickly view and edit code (going through dev/test/production, of course) than to open up an assembly, compile, and redeploy. Plus, a SQL developer with an IntelliSense plugin like Red Gate's might argue that he or she can get IntelliSense similar to a strongly typed dataset as well.

I understand your concerns about real-time data. I've never developed a real-time data warehouse; however, low-latency setups are a different animal and are quite common.

Since performance really isn't the issue here, I would suggest that this is a question more for you and your team, based on the skillsets available to develop and maintain whichever method (or mixture) you choose.
 
I would tend to agree with all that.

I am in the process of creating a service and supporting logic now to see how it goes.

I think this will prove to be a good solution (mix) for us, but time will tell. There may be some limitations I'm not aware of that would prevent large result sets or something similar. It will be a quick education going through the process either way.

Thanks for the feedback!

J
 