The purpose of this white paper is to describe how to use the Service Broker included in Microsoft SQL Server 2005 to create a real-time reporting solution based on your production OLTP database. This paper will guide you step by step through the process.
For the purposes of this white paper it is assumed that the production database called DB1 exists on SQL01 and that the reporting database called DB2 exists on SQL02. Both servers are running Microsoft Windows Server 2003 and Microsoft SQL Server 2005. No specific service pack is required to use these features.
The goals of this solution are to provide a reporting database which is updated in real time, or near real time, from the production OLTP database, with no code changes to the production database or front-end application, and with minimal additional load on the production system. We also require delivery confirmation of each data change as well as ordered delivery. Based on these requirements the Service Broker is an excellent delivery mechanism, and triggers on the required tables make an excellent data collection method.
The data flow works as follows. A user makes a data change in a table (INSERT, UPDATE or DELETE). That change is captured by a trigger, the affected record or records are bundled into an XML document and sent to a Service Broker queue. The Service Broker then transmits the message to the destination database on the reporting server, where the XML data is turned back into relational data and the required fields are updated in the reporting database.
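To make the message format concrete, here is a sketch of what the triggers shown later in this paper produce. The column names Col1 and Col2 are placeholders for whatever columns exist in your table.
[code Sample Message Shape]
-- Illustrative only: FOR XML RAW ('Table'), ROOT('ROOT') builds an attribute-centric document.
SELECT 'T1' TableName, 'I' CommandType, Col1, Col2
FROM T1
FOR XML RAW ('Table'), ROOT('ROOT')
-- Produces a document shaped like:
-- <ROOT>
--   <Table TableName="T1" CommandType="I" Col1="1" Col2="42" />
-- </ROOT>
GO
[/code]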
The first thing that needs to be done is to enable the Service Broker on both DB1 and DB2. This is done via the ALTER DATABASE command; run the same statement against DB2 on SQL02.
[code Enable Service Broker]
ALTER DATABASE DB1
SET ENABLE_BROKER;
GO
[/code]
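To confirm that the broker is enabled, a quick check (run on each server) is to look at the is_broker_enabled flag in sys.databases:
[code Verify Broker Enabled]
-- is_broker_enabled should be 1 for the database in question on each server.
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name IN (N'DB1', N'DB2')
GO
[/code]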
We can now create and use Service Broker objects. First we will create the objects on DB1: an endpoint, two message types, one contract, two queues, two services and a route. To keep the object naming easy to keep straight we will use a URL naming standard. While the names appear to control routing of messages between servers, they do not; the URL standard simply makes it easier to keep the matching objects on multiple servers lined up. Service Broker object names are case sensitive regardless of the database collation, because the Service Broker does a binary comparison of object names rather than a text string comparison. All objects will be created using the least strict security settings. For sensitive customer data the security should be increased to the level required by your company.
First we will create the endpoint. Endpoints can be created on any available TCP port. They do not need to have the same name or use the same port number on both servers, however long term management will be easier if they do. The endpoints should be created on both SQL01 and SQL02 before anything else is done.
[code Create Endpoints]
CREATE ENDPOINT ReportingFeed_EndPoint
STATE=STARTED
AS TCP(LISTENER_PORT=1500, LISTENER_IP=ALL)
FOR SERVICE_BROKER (AUTHENTICATION=WINDOWS)
GO
[/code]
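As a quick sanity check that the endpoint is listening, you can query the endpoint catalog views (the name below matches the endpoint created above):
[code Verify Endpoint]
-- state_desc should be STARTED and port should be 1500.
SELECT e.name, e.state_desc, t.port
FROM sys.service_broker_endpoints e
JOIN sys.tcp_endpoints t ON t.endpoint_id = e.endpoint_id
WHERE e.name = 'ReportingFeed_EndPoint'
GO
[/code]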
Next we will create the message types.
[code Message Types]
CREATE MESSAGE TYPE [tcp://SQL01/DB1/ReportingFeed_MT] VALIDATION=NONE
GO
CREATE MESSAGE TYPE [tcp://SQL02/DB2/ReportingFeed_MT] VALIDATION=NONE
GO
[/code]
Next we will create the contract between the message types.
[code Contract]
CREATE CONTRACT [tcp://SQL01/DB1/ReportingFeed_Contract]
(
[tcp://SQL01/DB1/ReportingFeed_MT] SENT BY ANY,
[tcp://SQL02/DB2/ReportingFeed_MT] SENT BY ANY
)
GO
[/code]
Next we will create the message queues needed for the delivery.
[code Queues]
CREATE QUEUE [tcp://SQL01/DB1/ReportingFeed_Queue] ON [PRIMARY]
GO
CREATE QUEUE [tcp://SQL02/DB2/ReportingFeed_Queue] ON [PRIMARY]
GO
[/code]
Next we will create the actual services on the Queues, using the contract that we created.
[code Services]
CREATE SERVICE [tcp://SQL01/DB1/ReportingFeed_Service]
ON QUEUE [tcp://SQL01/DB1/ReportingFeed_Queue]
(
[tcp://SQL01/DB1/ReportingFeed_Contract]
)
GO
CREATE SERVICE [tcp://SQL02/DB2/ReportingFeed_Service]
ON QUEUE [tcp://SQL02/DB2/ReportingFeed_Queue]
(
[tcp://SQL01/DB1/ReportingFeed_Contract]
)
GO
[/code]
Finally we will create the route from SQL01 to SQL02. When creating the route it is recommended to include the BROKER_INSTANCE option as shown below. The value for this setting should be taken from the service_broker_guid column of sys.databases on the SQL02 server for the DB2 database.
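For example, the following query, run on SQL02, returns the value to substitute for the placeholder GUID in the route below:
[code Lookup Broker Instance]
-- Run on SQL02; use the returned GUID in place of the zero GUID below.
SELECT service_broker_guid
FROM sys.databases
WHERE name = N'DB2'
GO
[/code]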
[code Route]
CREATE ROUTE [tcp://SQL01/DB1/ReportingFeed_Route]
WITH SERVICE_NAME = 'tcp://SQL02/DB2/ReportingFeed_Service',
BROKER_INSTANCE = N'00000000-0000-0000-0000-000000000000',
ADDRESS = N'TCP://SQL02:1500'
GO
[/code]
Configuration of the Service Broker objects on SQL01 is now complete. You will now need to go to SQL02 and create the required objects. Most of the objects will be the same as those on SQL01. Since the endpoint was already created on SQL02 in the earlier step, it is not listed again.
[code Message Types]
CREATE MESSAGE TYPE [tcp://SQL01/DB1/ReportingFeed_MT] VALIDATION=NONE
GO
CREATE MESSAGE TYPE [tcp://SQL02/DB2/ReportingFeed_MT] VALIDATION=NONE
GO
[/code]
Next we will create the contract between the message types.
[code Contracts]
CREATE CONTRACT [tcp://SQL02/DB2/ReportingFeed_Contract]
(
[tcp://SQL01/DB1/ReportingFeed_MT] SENT BY ANY,
[tcp://SQL02/DB2/ReportingFeed_MT] SENT BY ANY
)
GO
[/code]
Next we will create the message queues needed for the delivery.
[code Queues]
CREATE QUEUE [tcp://SQL01/DB1/ReportingFeed_Queue] ON [PRIMARY]
GO
CREATE QUEUE [tcp://SQL02/DB2/ReportingFeed_Queue] ON [PRIMARY]
GO
[/code]
Next we will create the actual services on the Queues, using the contract that we created.
[code Services]
CREATE SERVICE [tcp://SQL01/DB1/ReportingFeed_Service]
ON QUEUE [tcp://SQL01/DB1/ReportingFeed_Queue]
(
[tcp://SQL02/DB2/ReportingFeed_Contract]
)
GO
CREATE SERVICE [tcp://SQL02/DB2/ReportingFeed_Service]
ON QUEUE [tcp://SQL02/DB2/ReportingFeed_Queue]
(
[tcp://SQL02/DB2/ReportingFeed_Contract]
)
GO
[/code]
Because of the security setup we are using we need to grant the PUBLIC role the right to use the service.
[code Rights]
GRANT CONTROL ON SERVICE :: [tcp://SQL02/DB2/ReportingFeed_Service] TO PUBLIC
GO
[/code]
Finally we will create the route from SQL02 back to SQL01. As before, it is recommended to include the BROKER_INSTANCE option as shown below. The value for this setting should be taken from the service_broker_guid column of sys.databases on the SQL01 server for the DB1 database (the same query shown earlier, run on SQL01, will return it).
[code ROUTE]
CREATE ROUTE [tcp://SQL02/DB2/ReportingFeed_Route]
WITH SERVICE_NAME = 'tcp://SQL01/DB1/ReportingFeed_Service',
BROKER_INSTANCE = N'00000000-0000-0000-0000-000000000000',
ADDRESS = N'TCP://SQL01:1500'
GO
[/code]
We now need to go back to SQL01 and create the stored procedure which the triggers will use to actually send the message to the Service Broker.
[code ssb_SendReportingMessage]
CREATE PROCEDURE ssb_SendReportingMessage
@message_body XML
AS
BEGIN
DECLARE @Handle_ID uniqueidentifier
BEGIN DIALOG CONVERSATION @Handle_ID
FROM SERVICE [tcp://SQL01/DB1/ReportingFeed_Service]
TO SERVICE 'tcp://SQL02/DB2/ReportingFeed_Service'
ON CONTRACT [tcp://SQL02/DB2/ReportingFeed_Contract]
WITH ENCRYPTION=OFF;
SEND ON CONVERSATION @Handle_ID
MESSAGE TYPE [tcp://SQL02/DB2/ReportingFeed_MT]
(@message_body);
END
GO
[/code]
This can be tested by executing the procedure and passing any XML value. If you then query the sys.conversation_endpoints catalog view you will see that the message has been sent. By going to DB2 on SQL02 and querying the [tcp://SQL02/DB2/ReportingFeed_Queue] Service Broker queue you will be able to see the message.
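A minimal test might look like the following; the XML body is a throwaway placeholder, not a real trigger payload:
[code Test Send]
-- Run on SQL01 in DB1.
DECLARE @test_body XML
SET @test_body = N'<ROOT><Table TableName="T1" CommandType="I" Col1="1" /></ROOT>'
EXEC ssb_SendReportingMessage @message_body = @test_body
GO
-- Check the state of the conversation on the sending side.
SELECT conversation_handle, state_desc, far_service
FROM sys.conversation_endpoints
GO
[/code]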
Now create the triggers on the tables which need to be transferred. The table T1 will be used for this example. When creating the triggers be sure to change the trigger name, the object the trigger is being created on, and the first value of each SELECT statement within the trigger. This value is used to tell the procedures on SQL02 which table the data is coming from. The XML tags should not be changed unless you also modify the tags on SQL02 to match, or the parser will not work. Because we are using the * wildcard in the SELECT statements, column additions to and drops from the table do not require modification of the triggers.
[code trg_T1_IUD_ReportingFeed]
CREATE TRIGGER trg_T1_IUD_ReportingFeed ON T1
FOR INSERT, UPDATE, DELETE
AS
BEGIN
DECLARE @xmlData as XML
IF NOT EXISTS (SELECT * FROM inserted) -- The trigger is being fired for a delete, therefore the deleted rows need to be processed on the reporting database.
BEGIN
SET @xmlData = (SELECT 'T1' TableName, 'D' CommandType, *
FROM deleted
FOR XML RAW ('Table'), ROOT('ROOT'))
END
ELSE -- An insert or update was sent, so we need to capture only the new data. The old data will already exist on the reporting system.
BEGIN
SET @xmlData = (SELECT 'T1' TableName, 'I' CommandType, *
FROM inserted
FOR XML RAW ('Table'), ROOT('ROOT'))
END
-- Send the XML document to the Service Broker via the procedure created earlier.
IF @xmlData IS NOT NULL
EXEC ssb_SendReportingMessage @message_body = @xmlData
END
GO
[/code]
Once you create these triggers and users begin updating the tables, you will begin to see messages arrive in the [tcp://SQL02/DB2/ReportingFeed_Queue] Service Broker queue on SQL02.
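You can watch the messages arrive with a simple SELECT against the queue on SQL02; reading a queue this way does not remove the messages:
[code Peek At Queue]
-- Run on SQL02 in DB2.
SELECT TOP (10) message_type_name, CAST(message_body AS XML) AS message_body
FROM [tcp://SQL02/DB2/ReportingFeed_Queue]
GO
[/code]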
On SQL02 we now need to create a procedure to identify the name of the table that we are processing.
[code asp_IsolateTable]
CREATE PROCEDURE asp_IsolateTable
@message_body XML,
@table_name sysname OUTPUT,
@CommandType nchar(1) OUTPUT
AS
BEGIN
DECLARE @hdoc int
exec sp_xml_preparedocument @hdoc OUTPUT, @message_body
SELECT DISTINCT @table_name = TableName, @CommandType = CommandType
FROM OPENXML(@hdoc, N'/ROOT/Table')
WITH (TableName sysname '@TableName',
CommandType nchar(1) '@CommandType') a
Exec sp_xml_removedocument @hdoc
END
GO
[/code]
We now need to create the procedure to actually parse and process the table. It is recommended to have one procedure per table to ease management of the code.
[code asp_ProcessTable_T1]
CREATE PROCEDURE asp_ProcessTable_T1
@message_body XML,
@CommandType nchar(1)
AS
BEGIN
DECLARE @hdoc int
exec sp_xml_preparedocument @hdoc OUTPUT, @message_body
IF @CommandType = 'D'
BEGIN
/*Your code to either null the values or delete the record goes here. If this procedure is for the base table (i.e. the parent table in the OLTP database), then you should delete the record. If not, then you should null the fields which come from this table.*/
DELETE FROM T1
WHERE T1.Column1 IN (SELECT Col1 FROM OPENXML(@hdoc, N'/ROOT/Table')
WITH (Col1 int '@Col1') a)
END
IF @CommandType = 'I'
BEGIN
/*Your code should first update any records which already exist in the table, then insert any records which do not exist.*/
UPDATE T1
SET T1.Column4 = Col8,
T1.Column9 = Col2
FROM OPENXML(@hdoc, N'/ROOT/Table')
WITH (Col1 int '@Col1',
Col8 int '@Col8',
Col2 int '@Col2') a
WHERE T1.Column1 = Col1
INSERT INTO T1
(Column1, Column4, Column9)
SELECT Col1, Col8, Col2
FROM OPENXML(@hdoc, N'/ROOT/Table')
WITH (Col1 int '@Col1',
Col8 int '@Col8',
Col2 int '@Col2') a
WHERE NOT EXISTS (SELECT Column1 FROM T1 WHERE Column1 = Col1)
END
exec sp_xml_removedocument @hdoc
END
GO
[/code]
After these procedures are all created we now need to create the procedure which will actually process the queue, identify the source table from the XML and call the correct procedure to load the data.
[code asp_ProcessQueue]
CREATE PROCEDURE asp_ProcessQueue AS
BEGIN
DECLARE @message_body as XML
DECLARE @Handle_ID as uniqueidentifier
DECLARE @table_name as sysname
DECLARE @CommandType as nchar(1)
WHILE 1=1 /*We process this within a loop so that we can drain the queue on each run instead of having to launch the procedure once for each message.*/
BEGIN
WAITFOR (
RECEIVE TOP (1) @Handle_ID = conversation_handle,
@message_body=cast(message_body as xml)
From [tcp://SQL02/DB2/ReportingFeed_Queue]
), TIMEOUT 1000 /*If no message is received within 1 second the RECEIVE returns with no rows, and the next check breaks us out of the loop so that we are not running the procedure for no reason, wasting CPU and memory resources.*/
IF (@@ROWCOUNT = 0)
BREAK
END CONVERSATION @Handle_ID /*By ending the conversation we are telling SQL01 that we have removed the message from the queue and that it no longer needs to remember that the message exists.*/
IF @message_body IS NOT NULL /*Checking just in case. We don't want to bother trying to process a message if it is null.*/
BEGIN
exec asp_IsolateTable @message_body=@message_body,
@table_name = @table_name OUTPUT,
@CommandType = @CommandType OUTPUT
IF @table_name IS NULL /*If there is no table name we need to notify the DBAs and give them the XML so that they can identify the problem and correct it.*/
BEGIN
DECLARE @mail_body nvarchar(max)
SET @mail_body = CAST(@message_body AS nvarchar(max))
Exec msdb.dbo.sp_send_dbmail @profile_name='Your DB Mail Profile',
@recipients='YourDBAs@yourcompany.com',
@subject='Invalid XML received',
@body=@mail_body
BREAK
END
SET @table_name = ltrim(rtrim(@table_name)) /*Clean up, just in case the XML pads with white space.*/
If @table_name = 'T1'
Exec asp_ProcessTable_T1 @message_body=@message_body, @CommandType=@CommandType
/*Repeat for each Table you are processing.*/
END
END
GO
[/code]
We now need to modify the Service Broker queue [tcp://SQL02/DB2/ReportingFeed_Queue] so that it fires the stored procedure asp_ProcessQueue automatically whenever a message arrives. We are enabling activation, telling the queue which procedure to execute, and setting the maximum number of copies of the stored procedure that are allowed to run at any one time.
[code Alter Queue]
ALTER QUEUE [tcp://SQL02/DB2/ReportingFeed_Queue]
WITH ACTIVATION
(STATUS=ON, PROCEDURE_NAME = dbo.asp_ProcessQueue, MAX_QUEUE_READERS=1, EXECUTE AS OWNER)
GO
[/code]
As soon as you alter the queue the system will begin processing the messages currently in the queue. You should now see the data being loaded into the tables.
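As a quick sanity check (a sketch only, assuming the T1 example table exists in DB2), you can confirm that the queue is draining and that rows are arriving in the reporting table:
[code Verify Processing]
-- Run on SQL02 in DB2. The message count should trend toward zero as the queue is processed.
SELECT COUNT(*) AS pending_messages
FROM [tcp://SQL02/DB2/ReportingFeed_Queue]
GO
-- Spot check the reporting copy of the example table T1.
SELECT TOP (10) * FROM T1
GO
[/code]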