
bcp question!!!! Urgent!

mkey (Programmer) - Oct 3, 2001
Hi All,
I'm populating a table with, let's say, one set of records. Then I populate the same table with a second set of records. The problem is that the second set contains some data that is already in the first set, so the second load fails with a PK constraint error. Is there a switch I could use so that the bcp utility would run without this interruption?

Thanks!
 
You can set the BCP -m option (max errors) to a high value and use the -e option to write the error records to a file.
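A command using those switches might look something like this (server, database, table, and file names are placeholders; the $ field terminator is assumed from the sample data later in the thread):

bcp mydb..mytable in dataset2.txt -c -t"$" -m 1000 -e bcp_errors.txt -S myserver -U myuser -P mypassword

Rows that bcp itself rejects (bad formatting, conversion failures, and so on) go to bcp_errors.txt, and the load continues until 1000 errors have accumulated.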

I suggest using DTS or BULK INSERT in T-SQL to import the text files. You gain better control over the load process and can choose how to handle duplicate records; one common pattern is sketched below.
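A minimal sketch of that pattern, assuming hypothetical table names and the $-delimited, two-column data shown later in the thread: BULK INSERT into a staging table that has no unique constraint, then copy across only the rows whose key is not already in the target.

-- staging table mirrors the target's columns but carries no unique constraint
CREATE TABLE staging_tbl (id int, code varchar(10))

BULK INSERT staging_tbl FROM 'C:\dataset2.txt'
WITH (FIELDTERMINATOR = '$', ROWTERMINATOR = '\n')

-- insert only rows whose key does not already exist in the target
INSERT INTO target_tbl (id, code)
SELECT s.id, s.code
FROM staging_tbl s
WHERE NOT EXISTS (SELECT 1 FROM target_tbl t WHERE t.id = s.id)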
Terry L. Broadbent - DBA
Computing Links:
faq183-874 contains "Suggestions for Getting Quick and Appropriate Answers" to your questions in the SQL Server forum. Many of the ideas apply to all forums.
 
So let me get this straight. If I use these switches, will the data be pushed into the table even if some of it duplicates rows that are already there?
 
No. As long as the unique constraint exists on the table, BCP will write duplicate rows to the error file rather than the table. If you want to insert duplicate rows, remove the unique constraint, for example as shown below.
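Something like this, if you go that route (constraint and index names are hypothetical):

ALTER TABLE tbl_name DROP CONSTRAINT pk_name
-- or, if the uniqueness comes from an index rather than a constraint:
DROP INDEX tbl_name.idx_name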
Terry L. Broadbent - DBA
 
I guess I didn't explain my problem clearly.
Data set 1:
1$AA
2$BB
3$CC

Data set 2:
5$DD
6$FF
1$AA

Now when I bcp in data set 2, I get a PK error because of 1$AA. I tested this scenario by putting a space in the data set 2 file:

Data set 2:
5$DD
6$FF
<space>
1$AA

In this case I was able to push in the first two records, and then it gave me an error something like "invalid character value for cast specification".
Also, the -e errors are not being written to a file. The error file is generated, but it contains no errors. Can you tell me why this is happening?
 
I apologize for the inaccurate info. Options -m and -e work as described, but I assumed they would also apply to the duplicate-row error. Not so. BCP simply stops when it attempts to insert a duplicate row, because SQL Server cancels the batch and rolls back the entire transaction. I haven't used bcp much since the introduction of DTS, and my ignorance shows.

You can alter the index, adding the IGNORE_DUP_KEY option. With this option set, SQL Server will issue a warning message and ignore the duplicate row. The row will NOT be inserted, but the BCP will continue.

Recreate the index like this:

CREATE UNIQUE INDEX idx_name ON tbl_name (col_name)
WITH IGNORE_DUP_KEY, DROP_EXISTING
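A quick way to see the effect (hypothetical single-column table; the same idea applies to your PK column):

CREATE TABLE tbl_name (col_name int NOT NULL)
CREATE UNIQUE INDEX idx_name ON tbl_name (col_name)
WITH IGNORE_DUP_KEY

INSERT INTO tbl_name VALUES (1)
INSERT INTO tbl_name VALUES (1)  -- skipped with a "Duplicate key was ignored." warning; the batch keeps running
SELECT * FROM tbl_name           -- returns just one row

A bcp load against the table behaves the same way: duplicate rows are dropped with the warning and the remaining rows load normally.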
Terry L. Broadbent - DBA
 
Thanks Terry. I will try your suggestion. Just a note: the -e error switch doesn't seem to work. When I use this switch in SQL Server (7.0), the error file is created, but it contains no error messages when I open it. Do you know why this happens?

Again thank you for all your help!
 
-e does work. As I mentioned in the last post, nothing is placed in the error file in this instance because BCP is not rejecting the rows. SQL Server rejects the duplicate insert and stops processing. BCP aborts and writes the error info to an output file (if you use the -o option) or to the screen.
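For example (names again placeholders), capturing that output in a file:

bcp mydb..mytable in dataset2.txt -c -t"$" -S myserver -U myuser -P mypassword -o bcp_output.txt

The duplicate-key message from SQL Server then lands in bcp_output.txt instead of on the screen.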
Terry L. Broadbent - DBA
 