
importing large text files with DTS

Status
Not open for further replies.

bestuardo (Programmer) · Jun 17, 2004 · 1 · US
I have tried to use the DTS wizard to import a very large text file into a table of the form tbl(a tinyint, b text). The bulk of the data from the file goes into the text column. Rows in the file that are under 1 MB import just fine, but rows over 1 MB (some are as large as 200 MB) won't import correctly. If I format the file as delimited, the larger rows won't load at all. If I use fixed-field format, the rows get truncated at 8 KB. According to the SQL Server documentation, text columns can hold up to 2 GB of data, so in theory I should be able to import the file. Any suggestions?

Thanks!
 
As I keep saying - bcp/BULK INSERT is much simpler (and faster) than DTS.

-- Set up a source table and a target table for the round trip
create table bcp (i int, t text)
create table bcpin (i int, t text)
insert bcp select 1, replicate('1', 8000)

-- Grab a text pointer so UPDATETEXT can work on the column
DECLARE @ptrval binary(16)
SELECT @ptrval = TEXTPTR(t)
FROM bcp
WHERE i = 1

-- Each UPDATETEXT appends the column to itself, doubling its length
UPDATETEXT bcp.t @ptrval null null bcp.t @ptrval
UPDATETEXT bcp.t @ptrval null null bcp.t @ptrval
UPDATETEXT bcp.t @ptrval null null bcp.t @ptrval
UPDATETEXT bcp.t @ptrval null null bcp.t @ptrval

select datalength(t) from bcp
-- 128000

-- Export with bcp, then load the file back into the second table (-c = character mode)
EXEC master..xp_cmdshell 'bcp test..bcp out c:\temp\tst.txt -c'
EXEC master..xp_cmdshell 'bcp test..bcpin in c:\temp\tst.txt -c'

select datalength(t) from bcp
select datalength(t) from bcpin
-- Both return 128000: the 128 KB text value survived the round trip intact,
-- well past the 8 KB limit the DTS wizard was hitting.
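If xp_cmdshell isn't available to you, the "in" half of the round trip can also be done from T-SQL with BULK INSERT. A minimal sketch, assuming the same file path and table as the demo above and the default terminators that bcp -c writes (tab between fields, newline between rows) - adjust if your file differs:

```
-- BULK INSERT equivalent of the bcp "in" step above.
-- Requires a SQL Server instance; path and table names are the demo's.
BULK INSERT test..bcpin
FROM 'c:\temp\tst.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- bcp -c default field terminator
    ROWTERMINATOR   = '\n'   -- bcp -c default row terminator
)
```

Like bcp, BULK INSERT streams text column data rather than forcing it through an 8 KB row buffer, which is why it copes with rows the DTS wizard truncates.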


======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
