Importing rows in a text file to columns in SQL Server


vime

Technical User
Aug 18, 2003
Hi all

Wondering if anyone could help. I urgently need to import multiple .txt files with the format

UsersName=david
TimeLoggedIn=12/12/01
Location=London

into a SQL Server table with a format of

UsersName=david TimeLoggedIn=12/12/01 Location=London

Any help will be greatly appreciated.

Thanks

 
This is a very straightforward DTS job. First, define the flat file layout (data source). Then define the database layout; DTS will even create the table for you. Then simply map each field in the flat file to the corresponding column in the database table. In fact, DTS will automatically map them for you if you create the table columns in the same order as they appear in the flat file.

Sometimes the grass is greener on the other side because there is more manure there - original.
 
Should be even easier via a bulk insert - look it up in BOL (Books Online).
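
For reference, a rough sketch of the BULK INSERT syntax, loading each line of the file into a one-column staging table (the table name and file path here are only placeholders, not anything from the original post):

-- Hypothetical staging table; each line of the file becomes one row
Create table dbo.LogLines (LineText varchar(255))

BULK INSERT dbo.LogLines
FROM 'C:\mylogfile.txt'
WITH (ROWTERMINATOR = '\n')
-- No FIELDTERMINATOR needed: the lines contain no tabs, so each whole line lands in LineText

You would still need a further step to split the UsersName/TimeLoggedIn/Location lines into separate columns, which is what the rest of this thread covers.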

======================================
Cursors are useful if you don't know sql.
DTS can be used in a similar way.
Beer is not cold and it isn't fizzy.
 
Thanks for your input. The problem I have is that

UsersName=david
TimeLoggedIn=12/12/01
Location=London

is just one record; the original file has 500 lines in a single column, e.g.


UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
UsersName=david
TimeLoggedIn=12/12/01
Location=London
etc...

How can I map the relevant line to the relevant column, i.e. all UsersName values in ColA, TimeLoggedIn in ColB and Location in ColC?

Thanks
 
I had a similar problem and this is how I solved it; it worked perfectly:
Step 1: create a staging table and 3 temp tables:
Create table tbl2 (col1 varchar(80))
Create table #temp1 (ctr int identity,Data1 varchar(80))
Create table #temp2 (ctr int identity,Data2 varchar(80))
Create table #temp3 (ctr int identity,Data3 varchar(80))
Step 2: Create a trigger on your staging table as follows:
Create trigger trg1 on Tbl2 for insert as
begin
-- Route each incoming line to the matching temp table based on its prefix
Declare @variable varchar(80)
select @variable = col1 from Inserted
if Left(Ltrim(@variable),4) = 'User'
Insert into #temp1 select @variable
if Left(Ltrim(@variable),4) = 'Time'
Insert into #temp2 select @variable
if Left(Ltrim(@variable),4) = 'Loca'
Insert into #temp3 select @variable
end
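
A quick way to sanity-check the trigger before running the bulk load (purely an illustrative test; run it in the same session that created the temp tables, since the trigger references them):

-- Hand-insert one line and confirm it gets routed to the right temp table
Insert into Tbl2 values ('UsersName=test')
Select * from #temp1 -- should now contain the test line
Truncate table Tbl2
Truncate table #temp1 -- truncate (not delete) so the identity counter resets before the real import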
Step 3: Import data from the file into your staging table using BULK INSERT like this. Make sure you specify the FIRE_TRIGGERS option:
BULK INSERT Northwind.dbo.Tbl2
FROM 'C:\mylogfile.txt'
WITH
(
FIELDTERMINATOR = '\n',
ROWTERMINATOR = '\n',
FIRE_TRIGGERS
)
As the bulk insert adds rows to the staging table Tbl2, the trigger populates the 3 temp tables. After the import completes, you can join the 3 tables to create a final table:
SELECT Data1 as 'UsersName', Data2 as 'TimeLoggedIn', Data3 as 'Location' INTO MYFinalTable
FROM #temp1
INNER JOIN #temp2 on #temp1.ctr = #temp2.ctr
INNER JOIN #temp3 on #temp1.ctr = #temp3.ctr
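
One thing to note: the values loaded this way still carry the UsersName= / TimeLoggedIn= / Location= prefixes from the file. If you only want the part after the '=', something along these lines should do it (just a sketch against the MYFinalTable created above; MyCleanedTable is a placeholder name):

-- Strip the 'Name=' prefix from each column, keeping only the value after the '='
SELECT
SUBSTRING(UsersName, CHARINDEX('=', UsersName) + 1, 80) as UsersName,
SUBSTRING(TimeLoggedIn, CHARINDEX('=', TimeLoggedIn) + 1, 80) as TimeLoggedIn,
SUBSTRING(Location, CHARINDEX('=', Location) + 1, 80) as Location
INTO MyCleanedTable
FROM MYFinalTable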

Voila!!!!
Bertrandkis.
 