
ActiveX Script in Data Transform Task

Status
Not open for further replies.

paulkeys

Programmer
Sep 21, 2006
80
Hopefully someone will be able to help with the problem below.

I'm migrating a SQL Server 2000 DTS package to SSIS and was wondering how to mimic the behaviour below in my SSIS package. (The migration wizard in BIDS wraps it all in an Execute DTS 2000 Package task, and I'm trying to rebuild it using native SSIS functionality.)

The DTS Package has an Oracle Source and a SQL Server destination, and a transform data task between the two.

Within the transform data task I have two transformations: one copies the varchar fields from the source table to the destination table, and an ActiveX script checks the validity of the datetime fields in the source and passes either the date or NULL to the destination field.

I am trying to replicate this in SSIS - any help would be much appreciated.

Thanks in advance
Paul
 
Should be pretty simple. The following sounds like the structure you need.

Within a single data flow, define an OLE DB Source (your Oracle system) and specify the table or query to extract your data. Then push the rows through a Derived Column or Script Component that checks and replaces the dates as needed. The last step is an OLE DB Destination: pick your connection and the table to insert the data into.
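For the Derived Column route, something like the expression below would do it. This is only a sketch: the column name OracleDate is a stand-in for your actual datetime column, and the check shown is the common one for Oracle dates that fall outside the SQL Server DATETIME range (1753-01-01 to 9999-12-31). Dates that are malformed rather than out of range would need the Script Component approach instead.

```
OracleDate < (DT_DBTIMESTAMP)"1753-01-01 00:00:00"
|| OracleDate > (DT_DBTIMESTAMP)"9999-12-31 23:59:59"
    ? NULL(DT_DBTIMESTAMP)
    : OracleDate
```

Map the expression's output (or use "Replace 'OracleDate'") to the destination column in the OLE DB Destination mappings.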

Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
MDXer is correct - use the Derived Column component to format the data. What I did for the same problem was to use a Conditional Split off the source and then a Derived Column to format. This way I could keep track of the rows that had a bad date. I used the default output to signal a good row and the first condition to branch off and "massage" the bad data. A gotcha with this is that you cannot "lock" the table on the destinations if they are going to be the same table.
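To sketch the split: the Conditional Split gets one condition for the bad-date branch, and everything else falls through the default output as a good row. The column name OracleDate and the output names are just examples; adjust the test to whatever "bad" means for your data.

```
-- Condition 1 (output "BadDates"): rows needing massaging
OracleDate < (DT_DBTIMESTAMP)"1753-01-01 00:00:00"

-- Default output ("GoodDates"): everything else flows straight through
```

Wire "BadDates" into the Derived Column that fixes or nulls the value, and "GoodDates" directly to its destination.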
 
If you branch the data off to massage it, you can then do a Union All to bring the cleaned rows back into the main data flow and use fast load with a table lock. Another way is to use a Script Component as a transformation and clean the bad data within it using VB.NET, which doesn't require branching the bad rows at all. I currently use this method in some data warehouse ETL packages.
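A minimal sketch of that Script Component approach, assuming the Oracle date arrives as a string column named OracleDate and the output column is called CleanDate (both names are hypothetical; SSIS generates the UserComponent base class and the Input0Buffer type from the columns you select in the component's editor):

```
' Script Component (Transformation) - nulls out dates that won't parse
' or that fall outside the SQL Server DATETIME range.
Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        Dim parsed As Date
        If Row.OracleDate_IsNull _
           OrElse Not Date.TryParse(Row.OracleDate, parsed) _
           OrElse parsed < New Date(1753, 1, 1) Then
            ' Invalid or out-of-range: pass NULL to the destination
            Row.CleanDate_IsNull = True
        Else
            Row.CleanDate = parsed
        End If
    End Sub
End Class
```

Because the bad rows never leave the main path, the destination can keep fast load and the table lock.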

Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
Thanks for your feedback, guys.

I've implemented a Script Component with a VB.NET function that runs on the date fields in the data flow and returns a modified value to the destination.
 
MDXer, thanks. I've really never used the Union All - I will have to give this a try.
 