
BULK INSERT


ajc123

Programmer
Oct 6, 2001
25
US
I'm sending some data to a client to load. I want to give the client an easy way to load the data -- simple for them and quick for me to prepare. The easiest way I can see to do this is to write a load_data.sql file with a series of BULK INSERT statements.
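
Roughly what I have in mind for load_data.sql is a series of statements like this (just a sketch -- the table names and the C:\load\ path are placeholders the client would adjust):

-- load_data.sql (sketch only; table and file names are placeholders)
BULK INSERT dbo.Customers
FROM 'C:\load\Customers.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

BULK INSERT dbo.Orders
FROM 'C:\load\Orders.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')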

Some of the original data is in SQL Server 7.0, and the target is (currently) SQL Server 7.0. The rest is in MS Excel. I tried two export formats. First, an Excel .csv file: BULK INSERT imported the quote marks (") around each field into the database as part of each value. Not good.

Then I tried a tab-delimited .txt file (fortunately there are no Tab characters embedded in the data). This _sporadically_ imports Tab characters into two of the later (and longer) columns, and occasionally concatenates the whole next row into the last column's value. This stinks, and I don't see a pattern to it.
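
The statement I'm running looks roughly like this (a sketch with placeholder names; one thing I still need to rule out is whether the ROWTERMINATOR actually matches the file's line endings, since a mismatch there could make rows run together):

BULK INSERT dbo.BigTable
FROM 'C:\load\BigTable.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')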

Is there something I'm doing wrong? What are my alternatives for a quick way to share data?

Thanks,

Avery Cohen

PS
I tried BCP, but I get errors with even the simplest requests - "SQL Server does not exist or access denied" - perhaps because I've been using NT Authentication to connect to SQL Server?
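
For what it's worth, the kind of bcp command I was trying looked roughly like this (a sketch -- servername, dbname, and the table name are placeholders). My understanding is that -S names the server and -T asks for a trusted (NT Authentication) connection, so maybe leaving one of those off is what's tripping it up:

bcp dbname.dbo.tablename out C:\export\tablename.txt -c -S servername -T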


 
A good option is to set the field delimiter to ||, (two pipe characters followed by a comma) and the row delimiter to ||\n (two pipe characters followed by a newline character). You won't face any problems during the BULK INSERT, as that combination is rarely found in field values.
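
For example (only a rough sketch -- the table name and file path are placeholders, and you should verify that the ||\n row terminator is accepted on your build):

BULK INSERT dbo.YourTable
FROM 'C:\load\YourTable.txt'
WITH (FIELDTERMINATOR = '||,', ROWTERMINATOR = '||\n')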

If the answer is not clear, please reply.

Thanks & Regards

Linto Mathew
linto_mathew@hotmail.com
 
I don't see how BULK INSERT will handle the pipe (|) character as a delimiter any better than the quote (") or Tab delimiter.

I was looking for a quick-and-dirty way to create a script to share data.

There are two ways to export data to flat files -- CSVs, which put quote marks (") around all fields and commas between fields, and TXTs, which put tab characters between fields and no quote marks.

The problem I was having was that the BULK INSERT of one of my files had sporadic (but regularly occurring) problems. I don't know what they were related to.

I ended up using a script with BULK INSERT and TXT files for four of the tables and manually importing a CSV for the one that wouldn't work with BULK INSERT. Some time savings for my client, but a long way 'round for me trying to get it to work.

I'm working toward having a good tool set for Production Migration. DTS doesn't work because you can't change the location of the source files for loading. I may have to learn the DTS Object Model and build it in VB, but that's not quick.

 
I'm not sure I understand your question correctly. What I understood is that you have some tables in various locations and want an easy way to export them to a client location.

Why don't you write a simple VB or C++ program that generates text files from the tables you specified? On the client side, the data can then be bulk inserted with the help of some scripts. When doing the BULK INSERT it is better to avoid "," as the field terminator, since there is a chance of a "," appearing inside a field value. So it is better to fix the delimiter as the pipe character; the drawback is that you may have to write a program to generate the text file. Hope this solves your problem.
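
If writing a VB or C++ export program is too much work, the bcp utility itself can write a pipe-delimited file (assuming the connection problem you mentioned gets sorted out). Only a sketch, with placeholder names; -c writes character data, -t sets the field terminator, and -T uses a trusted connection:

bcp dbname.dbo.tablename out C:\export\tablename.txt -c -t"|" -T -S servername

The client can then load that file with BULK INSERT using FIELDTERMINATOR = '|'.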
 
Have you tried a DTS package? Quite simple to use and very effective.

Rosko
 
