
Copy Data using Linked Server or Bulk Copy or SSIS?


nbhariyani

Programmer
Mar 25, 2011
Hello friends,

I have a task in my production DB that takes roughly 6-12 hours depending on the parameters, and it slows down my production DB environment.

My requirement is: I want to run that process on a different server or instance and finally dump the results back into the production DB. To process the data I need about 150 tables (some of which may have millions of records).

I have two options:
1. An Azure server
2. Another Windows server

Before starting the process I will copy all of these tables from the live server to a temp server. After processing I will bring only a few thousand rows back to the live DB.

Please suggest:
1. Which server should I use?
2. Which technique should I use to copy the data
(a linked server with direct access to the live DB, dumping the data into local tables;
or bulk copy;
or SSIS)?
 
There is not enough information to know which one is best.
For example, we would need to know the size of each table involved (both space and row count), the total size of the DB, and whether all the tables you need are on filegroups that could easily be restored independently.

Depending on a few things, the options for copying the data would be (see the sketches after this list):
- backup prod (with compression, striped over 4 or more backup files) -> restore to the new instance; depending on the size of the DB this may be the fastest way of copying the data
- SSIS/bulk copy (both behave almost the same way)
- linked server (with the destination DB set to bulk-logged recovery and using TABLOCK hints)
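To make the backup/restore option concrete, here is a minimal sketch of a striped, compressed, copy-only backup restored on the work instance. The database name, paths and logical file names are made up for illustration only (check the real logical names with RESTORE FILELISTONLY):

BACKUP DATABASE [ProdDB]
TO  DISK = N'D:\Backups\ProdDB_1.bak',
    DISK = N'D:\Backups\ProdDB_2.bak',
    DISK = N'D:\Backups\ProdDB_3.bak',
    DISK = N'D:\Backups\ProdDB_4.bak'
WITH COMPRESSION, COPY_ONLY, STATS = 5;

-- On the work instance (logical file names below are assumed)
RESTORE DATABASE [ProdDB_Work]
FROM DISK = N'E:\Staging\ProdDB_1.bak',
     DISK = N'E:\Staging\ProdDB_2.bak',
     DISK = N'E:\Staging\ProdDB_3.bak',
     DISK = N'E:\Staging\ProdDB_4.bak'
WITH MOVE N'ProdDB'     TO N'E:\Data\ProdDB_Work.mdf',
     MOVE N'ProdDB_log' TO N'E:\Log\ProdDB_Work_log.ldf',
     STATS = 5;

And a sketch of the linked-server variant with the destination DB in bulk-logged recovery and a TABLOCK hint (the linked server, database and table names are placeholders; minimal logging also depends on the target being a heap or empty):

ALTER DATABASE WorkDB SET RECOVERY BULK_LOGGED;

-- Pull the table across the linked server into the local work database
INSERT INTO WorkDB.dbo.BigTable WITH (TABLOCK)
SELECT *
FROM [PRODSRV].ProdDB.dbo.BigTable;

ALTER DATABASE WorkDB SET RECOVERY SIMPLE;  -- or back to FULL, as appropriate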

But before doing any of the above I would try to find out which steps of that task are causing it to take so long and see if they could be improved; for this we would need to know exactly what it is doing.
For example:
- Does part of it work with cursors? If so, can it be changed to a set-based approach (see the sketch after this list)?
- Does it involve a query with a significant number of joins, some of which could be moved out of the main query into temp tables?
- Can it be split into individual tasks where some run in parallel and a final one works on the results of the previous tasks?
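As a hypothetical illustration of the cursor point, a loop that opens a cursor and runs one UPDATE per row can usually be collapsed into a single set-based statement. The table and column names here are invented for the example:

-- Instead of: DECLARE CURSOR over dbo.OrderLines, FETCH, and UPDATE one row per iteration
-- do the whole table in one set-based UPDATE:
UPDATE ol
SET    LineTotal = ol.Qty * ol.UnitPrice
FROM   dbo.OrderLines AS ol
WHERE  ol.LineTotal IS NULL;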

I've seen too many cases where a query using what appears to be a simple OUTER APPLY to get the most recent value of a field related to a particular key on the sub table went from 15 minutes down to 20 seconds just by using an aggregation instead of a TOP 1 row.
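A sketch of that kind of rewrite, using invented tables dbo.Orders and dbo.OrderStatus (and assuming StatusDate is unique per OrderID):

-- TOP 1 per key via OUTER APPLY (can be slow when the sub table is large):
SELECT o.OrderID, s.StatusCode
FROM dbo.Orders AS o
OUTER APPLY (SELECT TOP (1) os.StatusCode
             FROM dbo.OrderStatus AS os
             WHERE os.OrderID = o.OrderID
             ORDER BY os.StatusDate DESC) AS s;

-- Aggregation-based rewrite: find the latest date per key once, then join back:
SELECT o.OrderID, os.StatusCode
FROM dbo.Orders AS o
LEFT JOIN (SELECT OrderID, MAX(StatusDate) AS MaxStatusDate
           FROM dbo.OrderStatus
           GROUP BY OrderID) AS latest
       ON latest.OrderID = o.OrderID
LEFT JOIN dbo.OrderStatus AS os
       ON os.OrderID = latest.OrderID
      AND os.StatusDate = latest.MaxStatusDate;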

Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
Yes, in my experience, if a SQL task runs for 6 hours, something is not written the right way and the SQL needs to be fixed. As fredericofonseca mentioned, look closely at your SQL statements, analyse which part is slow, and rewrite it.
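One way to see which statements are eating the time is to look at the plan cache DMVs; this is just a sketch (times in sys.dm_exec_query_stats are in microseconds):

SELECT TOP (20)
       qs.total_elapsed_time / 1000 AS total_elapsed_ms,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                      WHEN -1 THEN DATALENGTH(st.text)
                      ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;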
 
Thank you friends for your valuable replies; I will look closely at all the queries. But for now, as a fast solution, I have to go with a separate server and the BCP in/out technique to copy the data.
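For reference, a rough sketch of the bcp out/in approach for one table (server, database, path and table names are placeholders; -T is a trusted connection, -n is native format, and the TABLOCK hint plus a batch size help the load side):

bcp ProdDB.dbo.BigTable out D:\Transfer\BigTable.dat -S PRODSRV -T -n
bcp WorkDB.dbo.BigTable in D:\Transfer\BigTable.dat -S WORKSRV -T -n -b 50000 -h "TABLOCK"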
 