
insert data from another SQL server database


swetab

Programmer
May 6, 2003
Hello,
I have 2 SQL Server databases.
I have added a linked server on server_A that points to server_B.
I want a stored procedure to select the data from server_A.pubs.dbo.titles and insert it into server_b.pubs.dbo.titles.

This stored procedure is on server_A.
It is not completing the process when I execute the SP.
Can you please tell me how to use OPENQUERY to insert the data into the server_B database?


thanks
 
Openquery isn't needed to do this.

All that is needed within the procedure is an insert statement.
Code:
create procedure usp_test as
-- push the rows from server_A's copy of the table across the linked server into server_B
insert into server_b.pubs.dbo.titles
select *
from pubs.dbo.titles
go

Denny
MCSA (2003) / MCDBA (SQL 2000) / MCTS (SQL 2005) / MCITP Database Administrator (SQL 2005)

--Anything is possible. All it takes is a little research. (Me)
[noevil]
 
Yes, I have the insert statement in the SP, but when I try to run it, it keeps executing and never finishes.

And I just gave pubs as an example; we actually have a large amount of data (around 3 GB) to insert into the server_B table, and I am extracting the data from Server_A with a lot of joins.

So I wanted to know whether this sort of insert is the best approach for that much data, or whether using OPENQUERY would improve the execution.
Please suggest.
Thanks again.
 
The method mrdenny shows is probably the fastest.

With so much data, I would suggest you drop indexes on the target table before inserting the data, and then create the indexes after the process is finished. If you have indexes on the table, inserts will be slower.
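
A minimal sketch of that approach, assuming a hypothetical nonclustered index named idx_titles_pub_id on the target table (the index steps run on server_B, the insert from server_A):
Code:
-- on server_B: drop the nonclustered index on the target table before the load
DROP INDEX titles.idx_titles_pub_id

-- on server_A: push the data across the linked server
INSERT INTO server_b.pubs.dbo.titles
SELECT * FROM pubs.dbo.titles

-- on server_B: rebuild the index once the load is finished
CREATE NONCLUSTERED INDEX idx_titles_pub_id ON titles (pub_id)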


-George

Strong and bitter words indicate a weak cause. - Fortune cookie wisdom
 
Another thing to check, and in my opinion more important.....

Are the data and log files set to autogrow on Server B? If so, by how much? If you don't have them set properly, Server B's files will have to repeatedly grow during the insert. That can cause a huge delay.
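
To check and adjust this on Server B, something like the following should work (a sketch; the logical file names pubs and pubs_log are assumptions, verify them with sp_helpfile first):
Code:
-- on server_B: show the data/log files and their current growth settings
USE pubs
EXEC sp_helpfile

-- grow by a fixed size instead of a percentage (logical file names assumed)
ALTER DATABASE pubs MODIFY FILE (NAME = pubs, FILEGROWTH = 100MB)
ALTER DATABASE pubs MODIFY FILE (NAME = pubs_log, FILEGROWTH = 50MB)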

-SQLBill

Posting advice: FAQ481-4875
 
For the data file I have set automatic growth by 20%, and for the log file by 10%.

When this process is run on the same local SQL Server it completes in about 1 hour, but when I run it across the two SQL Servers it never completes and just keeps showing that the query is executing.

I tried inserting a smaller amount of data too, but the same thing happens.

My question is: why is it so slow, and why does it appear to do nothing when executing across the servers? Are there any alternatives for this?

Both servers have plenty of RAM and disk space.

Please suggest.
 
There could be many reasons why you are having a problem; there is not enough information here to really diagnose it. One thing to check is whether you have triggers on the table you are inserting into. If so, do they have cursors or loops? You may be having a problem because you are inserting 3 million records into an audit table or a related table one record at a time. That can take forever with that number of records.
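
One quick way to check for triggers on the destination table (a sketch, run on server_B; the trigger name shown is hypothetical):
Code:
-- on server_B: list any triggers defined on the destination table
EXEC sp_helptrigger 'dbo.titles'

-- then inspect a trigger body for cursors or loops (hypothetical trigger name)
EXEC sp_helptext 'trg_titles_audit'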

Depending on the current database size, the growth settings you have listed may not be enough to handle the transaction.

You may be running into locking issues. You may have an indexing issue. Your query may not be very good. You may have data conflicts that need to be resolved against the existing data.

It's possible the problem is in the network connection. Have you tested your linked server since you set it up, to see if you can select data from the other server in QA?
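
For example, a quick sanity check you could run from Query Analyzer on server_A:
Code:
-- verify the linked server responds and returns rows in a reasonable time
SELECT TOP 10 *
FROM server_b.pubs.dbo.titles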

Are there any scheduled jobs which might be causing a problem because they are running in the background?
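
One way to see whether any jobs are running at the time (a sketch using the standard msdb procedure on each server):
Code:
-- list jobs that are currently executing
EXEC msdb.dbo.sp_help_job @execution_status = 1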


Questions about posting. See faq183-874
 
I have tested the linked server after setting it up; I can access all the databases and records quickly.

The network is good and everything seems to be working fine.

I have only primary keys defined on the destination table.

I did a simple insert on pubs and it works fine.

But the insert on my database does not complete, even for just 3000 records.

Need urgent resolution.

 
Hello,

I wanted to update that I found why it was not completing the execution when I ran the stored procedure.

It was because I had transactions in the stored procedure.
Once I commented out all the transaction lines, it works fine now.

I am able to copy the data from the source to the destination server.

Can anyone explain why having transactions in the stored procedure caused issues with the load?

Thanks to all.
 
I would presume that there was something wrong with your transaction code. Did you not put in a commit statement and a way to roll back if an error was encountered?

If you are doing multiple actions in one proc, you need to be able to commit or roll back the whole transaction, or you will have data integrity problems.

Questions about posting. See faq183-874
 
Here are the details:
I am trying to insert a single record from one SQL Server (SERVER_A) into another SQL Server (SERVER_B) with transactions.

The environment is as follows:
Server_A
Windows 2000, SQL Server 2000
Server_B
Windows 2003 Enterprise Edition, SQL Server 2000
Question: should I enable network DTC access on SERVER_B?

Without transactions the process works perfectly in SQL Query Analyzer (it takes less than 1 sec).

But when I run the following code with a transaction in Query Analyzer, it just says "Executing query..." and neither finishes nor displays any error. I have waited for hours before cancelling the execution:

------------------------------------------------
-- attempt 1: plain local transaction
begin tran
--from server_A to server_B
insert into SERVER_B.pubs.dbo.testtable
SELECT c1, c2, c3, c4 from testtable
IF (@@ERROR <> 0)
BEGIN
    ROLLBACK TRAN
    PRINT 'ERROR'
    PRINT 'EXITING....'
END
ELSE
BEGIN
    COMMIT TRAN
END
------------------------------------------------

------------------------------------------------
-- attempt 2: explicit distributed transaction
begin distributed tran
insert into SERVER_B.pubs.dbo.testtable
SELECT c1, c2, c3, c4 from testtable
IF (@@ERROR <> 0)
BEGIN
    ROLLBACK TRAN
    PRINT 'ERROR'
    PRINT 'EXITING....'
END
ELSE
BEGIN
    COMMIT TRAN
END
------------------------------------------------

Thanks a lot SQLSister
 
I would keep it all in one transaction if possible and use @@ERROR to determine whether you should go on to the next step or roll back. Otherwise you could still insert into the local table even when the distributed transaction rolls back, which will probably create data integrity problems. That is especially true here, because the most likely cause of a rollback in this scenario is that the other server is not available at the time the transaction runs, or the transaction to it times out.

Questions about posting. See faq183-874
 
I tested inserting a data record from one SQL Server (SERVER_A) to another SQL Server (SERVER_B) with transactions.

Both servers are running Windows 2000.

It works perfectly, in under 1 sec.

All I had to add was "set xact_abort on".
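
For reference, a sketch of the pattern that now works, based on the earlier test code:
Code:
-- on server_A: SQL Server requires this for data modifications
-- against a linked server inside a transaction
SET XACT_ABORT ON

BEGIN DISTRIBUTED TRAN

INSERT INTO SERVER_B.pubs.dbo.testtable
SELECT c1, c2, c3, c4 FROM testtable

IF (@@ERROR <> 0)
    ROLLBACK TRAN
ELSE
    COMMIT TRAN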

Next, on the Windows Server 2003 cluster, I will try configuring and enabling MS DTC.

SQLSister, I didn't understand your last post. Can you please explain with some code?

Thanks
 
If you are doing multiple inserts, you want them in the same transaction or you will have a problem if one of them fails. Suppose you insert into a second table after the first. If the first insert is in one transaction, it completes and is done; then the second runs in a different transaction, fails, and rolls back. Now you have records in table 1 which do not have corresponding records in table 2, which creates a data integrity problem.
This is especially a problem with transactions going to another server, as they will fail if the connection to the other server is lost.

So I wrap all the code for multiple actions in one transaction and move through it step by step, skipping any remaining steps after one fails and going straight to the rollback statement. You use the @@ERROR variable after each step to find out whether it failed.

One of the benefits of this is that you can then send the error to an error table and see what was causing it to fail, or you can run a select only after the whole transaction has happened and been committed, or send yourself some kind of message. (If I'm doing preprocessing before running our import tool, after the commit statement I have it print to the screen all the details I need to provide to the import tool, such as what the default password for new users from that client is.) The basic format I use is something like:
Code:
Declare @intErrorCode int
Declare @intTransactionCountOnEntry int
Declare @ProjectNumber int

IF isnull(@intErrorCode, 0) = 0
	Begin
		Select @intTransactionCountOnEntry = @@TranCount
		BEGIN TRANSACTION
	End

Select @intErrorCode = @@Error

If @intErrorCode = 0
	Begin
		--insert code here
		Select @intErrorCode = @@Error
	End

If @intErrorCode = 0
	Begin
		--insert code here
		Select @intErrorCode = @@Error
	End

If @@Trancount > @intTransactionCountOnEntry
	Begin
		If @intErrorCode = 0
			Begin
				COMMIT TRANSACTION
				--insert code here to send a success message to the screen, perform any
				--select needed, or run code that doesn't need to be wrapped in this transaction
			End
		Else
			Begin
				ROLLBACK TRANSACTION
				--insert code here to send the error to an error table or to the screen;
				--use master..sysmessages to get the error description if you want
			End
	End
SET XACT_ABORT ON will also roll the transaction back, but it doesn't give you the flexibility of performing any action other than the rollback when the transaction has to be rolled back.

Questions about posting. See faq183-874
 