
CSV file to SQL Server

Status
Not open for further replies.

Ctussey

Programmer
Sep 17, 1999
US
I need to convert a CSV file into SQL Server 6.5. Beyond the normal problem of the commas, each two rows in the file need to become one row in SQL Server. Any help would be great.
 
Can you use VB? Have you used BCP before? The bcp.fmt file tells bcp what the format of the file is.
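As a sketch, a bcp load of a three-column CSV might look like the format file below. The column names, widths, and paths are made up for illustration - check the format-file layout in your Books Online before relying on it:

```text
6.0
3
1  SQLCHAR  0  50  ","     1  cust_id
2  SQLCHAR  0  50  ","     2  cust_name
3  SQLCHAR  0  50  "\r\n"  3  amount
```

The first line is the bcp version, the second is the number of fields, and each remaining line gives the host column number, data type, prefix length, maximum length, terminator, target column number, and column name. You would then run something like `bcp mydb..target_table in c:\data.csv /f c:\bcp.fmt /S myserver /U sa /P` from a command prompt (server, database, and login names hypothetical).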
 
The method I was going to use is to import the CSV into an Access table (with commas and such left in), then use VB to parse out the information (using InStr) to break at the commas and populate the text boxes. Then call out to SQL Server and enter the data into it. In theory it's sound, but I wanted to see if there was an easier, less time-consuming way to do this.
As to BCP, it doesn't ring a bell with me. If it's an easier method, I'm all for it.
Thanks
 
BCP is the way to go. It's a command-line utility, but if I'm not mistaken there is an object front-end that can be manipulated in VB/VBA.

 
Have you tried using the Data Transformation Services? Or isn't that available in 6.5?
 
Yes, I would recommend BCP. However, as an alternative I have used the following in the past, wrapped up in a Transact-SQL stored procedure:

1. Create a (temporary) table to hold the file, e.g.

create table #input_file ( line varchar(255) NULL )

2. Use an extended procedure to run a command shell, issuing a 'type' command on the file:

insert into #input_file (line)
exec master..xp_cmdshell 'type c:\filename.csv'

This loads the output of the command shell (i.e. the file contents) into your table.

3. Now write code to do substring manipulation to extract the columns from your CSV data, e.g.

update target_table
set target_table.time_stamp = convert(datetime, substring(#input_file.line, 1, 15))
from #input_file
...
etc etc

4. Finally, if you've created a temporary table, remember to clear it down and blow it away.

One advantage of this method over BCP is that it fires any triggers that you've defined.

Your DBA may complain about security - xp_cmdshell runs under the account that starts the database service; however, they can create a separate NT account just to run xp_cmdshell, should they feel strongly.

Hope that this helps.

Incidentally, my original use for this was to execute a 'dir /s' command in the shell to recursively build up a file tree and store the file details in a DB table.

Cheers,

Martin.Ramsay@company-net.co.uk
 
Should have added to my original post ...

To manipulate two rows at a time, as you asked in your original request, you could process the table using a cursor, extracting the data into @variables and then using T-SQL to concatenate the results from each pair of rows read in through the cursor.
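The cursor approach above can be sketched roughly as follows, assuming the #input_file holding table from the earlier steps; the target table and the comma separator are illustrative only:

```sql
-- Sketch: combine each pair of consecutive lines into one row (illustrative)
declare @line varchar(255), @prev varchar(255), @odd int
select @odd = 1

declare line_cur cursor for
    select line from #input_file

open line_cur
fetch next from line_cur into @line
while @@fetch_status = 0
begin
    if @odd = 1
        select @prev = @line, @odd = 0   -- first line of the pair: remember it
    else
    begin
        -- second line of the pair: concatenate and insert one combined row
        insert into target_table (raw_line) values (@prev + ',' + @line)
        select @odd = 1
    end
    fetch next from line_cur into @line
end
close line_cur
deallocate line_cur
```

Note that #input_file as created has no ordering column, so strictly you would want to add a line-number column to guarantee the pairing order.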
<br>
Cheers,

Martin

Martin Ramsay
Martin.Ramsay@company-net.co.uk
Looking for interesting work in Canada - any offers?
 
Martin,
Excellent tip. I hadn't realised that the results from an xp_cmdshell could be inserted into a table, and as a cunning way of importing data it could be invaluable.

Jez.
 