Hi all,
I've set up a DTS package that imports data from an ODBC connection into SQL Server. There are six tables whose details can change, so there's a step for each of them that clears the SQL table, then imports the data. Works fine.
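For reference, the clear step in each case is just an Execute SQL task along these lines, run before the data pump copies the source back in (the table name here is made up):

-- Illustrative: empty the SQL mirror; the import step then reloads it from the ODBC source
TRUNCATE TABLE dbo.Customers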
Then there's table 7, which is a transaction history file. Only new records are ever added to the ODBC source, and existing records never change, so it makes no sense to clear the SQL mirror and re-import everything each time. Instead, I set up a unique index on the SQL table so that duplicate inserts would fail. The process still steps through the million or so records, and the new records do get imported, but the step then reports a failure because of the index violations.
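In case it matters, the unique index is along these lines (table and column names are illustrative; the actual key is the source's transaction ID):

-- Illustrative: reject any row whose transaction ID already exists in the mirror
CREATE UNIQUE INDEX IX_TranHistory_TranID
ON dbo.TranHistory (TranID)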
The problem is that the package as a whole then reports a failure. I could set the maximum error count for that step to 9 million or so, but then it wouldn't trap errors I might actually be interested in.
How should I go about handling this?
Thanks.