
Should I use DTS?

Status
Not open for further replies.

Modica82

Technical User
Jan 31, 2003
410
GB
Hi all,

I have an imminent project coming up which will see me transport data from a FileMaker database to SQL Server. I have several distinct steps:

1) Import the CSV into a temporary table
2) Identify two sets of data:
-------- Update: data that has already been inserted and just needs to be updated
-------- Insert: data that has not yet been added to the database
3) Do the insert
4) Do the update
5) Do a compare - here I have to go through the set of records updated and check whether the records were updated and which columns were updated
6) Email that the process has been successful
7) Email a list of updated fields for each updated client
8) Output a CSV to be sent back
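Roughly, I imagine steps 1 and 2 looking something like this (all the table, column and file names here are just placeholders I've made up):

```sql
-- Placeholder names throughout: #Staging, Clients, ClientID, etc.
CREATE TABLE #Staging (ClientID varchar(20), ClientName varchar(100), Town varchar(50))

BULK INSERT #Staging FROM 'c:\import\clients.csv' WITH (FIELDTERMINATOR = ',')

-- Step 2: rows that already exist (to be updated)
SELECT s.*
FROM   #Staging s
JOIN   Clients c ON c.ClientID = s.ClientID

-- Step 2: rows not yet in the database (to be inserted)
SELECT s.*
FROM   #Staging s
LEFT JOIN Clients c ON c.ClientID = s.ClientID
WHERE  c.ClientID IS NULL
```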

I really have no idea where to start with this. Any help would be greatly appreciated; even some links to online tutorials would be amazing.

Regards,

Rob


 
DTS seems to be a good choice to handle your requirements.
------------------

-------------------------
John Herman is available for short and long term data warehousing consulting and contracts.
 
DTS seems like an ideal tool for your requirements, although when you start, be sure to reverse steps 3 and 4: do your updates before your inserts. This allows you to use left outer joins and avoid inserting records that will be updated.
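Something like this, reusing the staging-table idea (all names here are only placeholders):

```sql
-- Step 4 first: update rows that already exist in the production table
UPDATE c
SET    c.ClientName = s.ClientName,
       c.Town       = s.Town
FROM   Clients c
JOIN   #Staging s ON s.ClientID = c.ClientID

-- Then step 3: the left outer join now only picks up genuinely new rows
INSERT Clients (ClientID, ClientName, Town)
SELECT s.ClientID, s.ClientName, s.Town
FROM   #Staging s
LEFT JOIN Clients c ON c.ClientID = s.ClientID
WHERE  c.ClientID IS NULL
```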

"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
I wouldn't - DTS tends to be a bit of an overkill for text file manipulation.
It would probably be easier (and faster) via a stored proc using BULK INSERT, bcp and xp_sendmail.
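A bare-bones sketch of such a proc (the table, proc and address names are made up, and xp_sendmail requires SQL Mail to be configured on the server):

```sql
-- Sketch only: load the file into a staging table, then mail a notification.
CREATE PROC usp_LoadClients
AS
BEGIN
    BULK INSERT ClientStaging
    FROM 'c:\import\clients.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

    EXEC master..xp_sendmail
         @recipients = 'dba@example.com',
         @subject    = 'Client import finished',
         @message    = 'BULK INSERT of clients.csv completed.'
END
```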

Have a look at

which imports and archives all files that arrive in a directory.



======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
Thank you all for your responses.

Thanks Nigel for the link, I will check it out :)

Rob
 
I'm not disagreeing with you Nigel. I think the BULK INSERT SP and xp_sendmail are the objects you would use for each step, but having a stored proc that encompasses all the steps is a bad idea. Using a DTS package that calls the SP to do the bulk insert, then the update and insert, and finally a Send Mail task would allow greater control of the workflow.

"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
Isn't DTS just BCP on steroids?

-------------------------
Sometimes the grass is greener on the other side because there is more manure there !
 
Nope. DTS is a client application. bcp is a command line utility.

I don't like DTS because it tends to get overused - mainly by people who don't understand databases and are happy with a GUI that lets them get something working - but it usually ends up a big mess.

Points of complication are change of process control and change of data ownership. Using a DTS package to control processing (unless it is the whole system) will introduce a process-control interface and also a data-ownership interface. If these can be minimised, it will lead to a simpler system.

======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
The answer to Modica82's question would be: yes, you could. As to the definitive answer of whether you should use DTS, it depends on the scope and scale of what needs to be done.

"I don't like dts because it tends to get overused - mainly by people who don't understand databases and are happy with a gui that lets them get something working - but usually ends up a big mess."
is an overgeneralization. DTS has its place in the MS SQL arsenal. Its use should be based on the needs of the system, the volume of work, and the manner in which it needs to be done. If it is used simply to put a GUI on bcp and other tasks, then yes, it is being misused. However, if it is used as a form of ETL tool, it seems a logical option when funds for a more robust ETL tool like DataStage or Informatica are not available. Any tool in the SQL arsenal can be misused by the less skilled. People use Enterprise Manager to rename a column or change its datatype. People also use Current Activity in Enterprise Manager in place of sp_who and a few DBCC commands; does this mean they are wrong?

To code an entire data load, with error handling and the needed transforms, into stored procedures could be seen as an inefficient use of time, whereas using a DTS package to control the workflow and error handling would be more efficient.


"Shoot Me! Shoot Me NOW!!!"
- Daffy Duck
 
Hi all,

Should I create all the steps in stored procedures and then link them into my DTS package? I know this will probably be a misuse of what it is actually for, but I need a sequential route to be run and I need it to be scheduled.

I would like to code the whole set in DTS, but I have no idea where to start, to be honest.

Rob
 
You can do it either way.
You can schedule a stored proc or a dts package.
Or schedule a stored proc which runs dts packages or a dts package which runs stored procs.
Or create a job which has multiple steps running packages and SPs.
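For example, a two-step Agent job could be set up like this (the job, proc and package names are placeholders; run against msdb):

```sql
-- Create the job, add two steps, target this server and schedule it nightly.
EXEC msdb..sp_add_job       @job_name = 'Client import'

EXEC msdb..sp_add_jobstep   @job_name = 'Client import', @step_id = 1,
     @step_name = 'Load staging',   @subsystem = 'TSQL',
     @command = 'EXEC usp_LoadClients',
     @on_success_action = 3         -- 3 = go to the next step

EXEC msdb..sp_add_jobstep   @job_name = 'Client import', @step_id = 2,
     @step_name = 'Run DTS package', @subsystem = 'CmdExec',
     @command = 'dtsrun /S myserver /E /N MyPackage'

EXEC msdb..sp_add_jobserver @job_name = 'Client import'

EXEC msdb..sp_add_jobschedule @job_name = 'Client import',
     @name = 'Nightly', @freq_type = 4, @freq_interval = 1,
     @active_start_time = 010000    -- 01:00 each day
```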

There are often requirements that make some solutions better than others but sometimes it is just down to the experience of the developer.
Keep anything you are unsure about or that is new to you separate.
Don't try to process data at the same time as you transfer it - that will make it difficult to change either task without affecting the other and also make testing more difficult.

In your case I would be tempted to make a lot of small steps - either in DTS or SPs. Once the data has been imported, SPs would seem to be the way to go.
That would mean you could test each step in isolation and just use the agent to run the sequence and monitor failures.


I like to control things from the database rather than a client, so I would make any DTS packages small and for specific tasks - that makes it easy to change the way those tasks are performed without affecting the rest of the process.

======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
This is a question for Nigel,

I have looked at the link you sent me; could you please explain the last stored procedure? I don't understand how you are using the substring functions to extract the data.

Rob
 
That is for a fixed-width file, so fields are always in the same place.
For a delimited file it is probably easier to import into a staging table with the correct columns. I would make them all character columns so that you can manipulate the data on the move to production and don't get problems with padding characters, date formats and such.


======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
Do you mean that instead of just making ##Import a table with one column, I should actually build the table structure and have it insert into that? How would I be able to achieve that? Could you give me a basic sample?

Rob
 
You can just create a table with the correct number of columns - make them varchar.
Then for a comma-delimited file:

create table ##a (c1 varchar(100) null, c2 varchar(100) null, ...
bulk insert ##a from 'c:\bcp.txt' with (FIELDTERMINATOR = ',')

Then copy the data to production tables, doing any reformatting on the way.
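e.g. something like this (the production table Clients and its columns are placeholder names; style 103 in CONVERT is dd/mm/yyyy):

```sql
-- Copy from the all-varchar staging table ##a to a typed production table,
-- casting and trimming on the way.
INSERT Clients (ClientID, ClientName, JoinedOn)
SELECT CAST(c1 AS int),
       LTRIM(RTRIM(c2)),
       CONVERT(datetime, c3, 103)
FROM   ##a
WHERE  c1 IS NOT NULL
```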

If you want a permanent table and don't want it to cater for different numbers of columns, then you can bcp into a view or use a format file.


======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 