
Import CSV file with duplicates while using unique indexes?


Taf · Programmer · Feb 17, 2002 · AU
Does anyone know how you can import a CSV file, or any delimited text file, into a table that has unique indexes? The text file may contain duplicate values, and I need the import to ignore them rather than throw an error and import nothing. Any help would be appreciated, thanks.
 
One approach is to import into a staging or temp table first, then select distinct from it and insert into your final table. Robert Bradley
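For anyone who wants to do this in plain T-SQL rather than DTS, here is a minimal sketch of that staging-table approach. The file path, column names, and table names (dbo.StagingTable, dbo.FinalTable, C:\data\import.csv) are placeholders, not from the original post:

```sql
-- Sketch only: load everything into an unconstrained staging table,
-- then copy distinct rows into the real table with the unique index.
CREATE TABLE dbo.StagingTable (
    ID   INT,
    Name VARCHAR(100)
);

BULK INSERT dbo.StagingTable
FROM 'C:\data\import.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Only distinct rows go into the final table, so the unique index
-- is never asked to accept a duplicate.
INSERT INTO dbo.FinalTable (ID, Name)
SELECT DISTINCT ID, Name
FROM dbo.StagingTable;

DROP TABLE dbo.StagingTable;
```

Note that SELECT DISTINCT only removes rows that are duplicated in every column; if two rows share the same key but differ elsewhere, you would need to pick one row per key (for example with GROUP BY) before the insert.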

 
You can use DTS, but you have to specify that you are using a unique index.
 
Thanks for these answers, but when I run the DTS Wizard it throws up errors.

Any idea on how to increase the number of errors allowed?

Regards,

Thierry
 
In Enterprise Manager, open the DTS package and double click the transformation that is copying the text file to SQL Server. Click on the Advanced tab. There you can set the Max error count and designate an Exception file name.

You must also turn off the Fast Load option. Terry
 
I use a two-step import: first import into a SQL table that has a unique index with Ignore Duplicates turned on; this throws away the duplicates. Then you can import from there into a table with a defined primary key index, because you won't have any duplicates.
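As a rough T-SQL illustration of that two-step idea (table and column names here are made up, and it assumes the raw CSV rows have already been loaded into a table such as dbo.RawImport by DTS or a bulk load):

```sql
-- Step 1: a dedup table whose unique index silently discards duplicate
-- keys (warning "Duplicate key was ignored.") instead of failing.
CREATE TABLE dbo.DedupTable (
    ID   INT,
    Name VARCHAR(100)
);

CREATE UNIQUE INDEX IX_DedupTable_ID
    ON dbo.DedupTable (ID)
    WITH (IGNORE_DUP_KEY = ON);

INSERT INTO dbo.DedupTable (ID, Name)
SELECT ID, Name
FROM dbo.RawImport;          -- rows with a repeated ID are dropped here

-- Step 2: every ID is now unique, so the load into the table with the
-- primary key cannot hit a duplicate-key error.
INSERT INTO dbo.FinalTable (ID, Name)
SELECT ID, Name
FROM dbo.DedupTable;
```

On older SQL Server versions (from around the time of this thread) the index option is written WITH IGNORE_DUP_KEY instead of WITH (IGNORE_DUP_KEY = ON).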
 