
Transforming data


Machiaveli (Programmer), Dec 16, 2003
Hi,

I have some text files that I would like to import. The problem is that, once the fields are separated, there are over 1,024 columns, which exceeds SQL Server's maximum number of columns per table.

The data in these text files comes from data-capture software. When separated, several columns together represent a single field: each part carries an index, described in another text file that I don't import (see the example below).

formid  fieldnr  fieldname   idx  length
1       0        #Reference  0    11
1       0        #Date       0    6
1       0        #Time       0    6
1       1        question1   0    2
1       2        question1   1    1
1       3        question1   2    1
1       4        question2   0    2
1       5        question3   0    4
1       6        question4   0    2
1       7        question5   0    5
1       8        question6   0    5
1       9        question7   0    4

In the text file that contains the data, all of the columns are present without column names, and they appear in the order of the example above.

There should be a way to transform the data so that the columns for which no value is specified are skipped.

Can someone help me with this?
 
Machiaveli:

You could import this text file into a work table with one (or more, if necessary) VarChar columns, without worrying for the moment about the real column divisions. Then, once the text file is imported, you can parse each record correctly using a stored procedure.
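
To make that concrete, here is a minimal sketch of the work table and a load step. The table name ImportTable matches the procedures below, but the file path, the terminators, and the use of BULK INSERT (rather than a DTS task) are assumptions; on SQL Server 2000 a VarChar column tops out at 8000 characters.

CREATE Table ImportTable (
    Column1 VarChar(8000) -- one whole raw line per row
)

-- DTS (and BULK INSERT) only append, so clear the table before each load.
TRUNCATE Table ImportTable

-- Load each whole line into Column1 by picking a field terminator that
-- never occurs in the file; '\0' and the path are illustrative.
BULK INSERT ImportTable
FROM 'C:\Data\capture.txt'
WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\n')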
 
Do you have an example of what this stored procedure would look like? I could learn from it.
 
Note: your import table should be truncated before each import, because the DTS Data Pump task only appends incoming data; it doesn't replace it.


This first example uses an import table with a single column named Column1, and a cursor.


CREATE Procedure ProcedureName
As

Set NoCount On

Declare
    @Column1 VarChar(8000), -- wide enough to hold a whole raw line
    @Done    Bit,
    @True    Bit,
    @False   Bit

-- Initialize
Select
    @True  = 1,
    @False = 0

-- Create a cursor so we can step through the records.
Declare CursorName Cursor
For
    Select Column1
    From ImportTable

Open CursorName

Select @Done = @False

While @Done = @False
Begin

    Fetch Next
    From CursorName
    Into @Column1

    If @@Fetch_Status = -1 -- No, or no more, records.
    Begin
        Select @Done = @True
    End
    Else -- Process the record.
    Begin

        -- Processing statements go here.

    End -- Process the record.

End -- While @Done = @False

Close CursorName
Deallocate CursorName

Set NoCount Off

Go
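
The "-- Processing statements go here" placeholder is where each fetched line gets split into real fields. A minimal sketch, assuming the line is fixed-width with the lengths from the definition file in the question; the destination table ParsedData and its column names are assumptions, not from the thread.

-- Split one raw line by position, using the lengths from the definition
-- file: #Reference 11, #Date 6, #Time 6, question1 2+1+1, and so on.
-- ParsedData and its columns are illustrative assumptions.
Insert Into ParsedData (Reference, CaptureDate, CaptureTime, Question1)
Select
    SubString(@Column1,  1, 11), -- #Reference: positions 1-11
    SubString(@Column1, 12,  6), -- #Date: positions 12-17
    SubString(@Column1, 18,  6), -- #Time: positions 18-23
    SubString(@Column1, 24,  4)  -- question1: its three indexed parts (2 + 1 + 1)

If the file is delimiter-separated rather than fixed-width, the same spot would walk the delimiters with CharIndex instead of using fixed offsets.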



This second example uses an import table with two columns: an Identity column named RecordID and a second column named Column1. This is supposedly faster than using a cursor, but I haven't seen much of a time difference.
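
A sketch of the matching import table; the Identity column supplies the sequential RecordID that the loop walks (the column size is an assumption):

CREATE Table ImportTable (
    RecordID Int Identity(1, 1) Primary Key,
    Column1  VarChar(8000)
)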


CREATE Procedure ProcedureName
As

Set NoCount On

Declare
    @Column1  VarChar(8000), -- wide enough to hold a whole raw line
    @RecordID Int,
    @MaxID    Int

Select
    @RecordID = Min(RecordID),
    @MaxID    = Max(RecordID)
From
    ImportTable

While @RecordID <= @MaxID
Begin

    -- Make sure there is a record in the import table with this
    -- RecordID. This 'If' may not really be necessary, but I like
    -- to cover all my bases.
    If Exists (
        Select RecordID
        From ImportTable
        Where RecordID = @RecordID
    )
    Begin -- Process the record.

        Select @Column1 = Column1
        From ImportTable
        Where RecordID = @RecordID

        -- Processing statements go here.

    End -- Process the record.

    -- Advance to the next record.
    Select @RecordID = @RecordID + 1

End -- While @RecordID <= @MaxID

Set NoCount Off

Go
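
With either version, a full load then looks like this; the load step itself is the DTS Data Pump task or the BULK INSERT sketched earlier.

TRUNCATE Table ImportTable

-- ... load the text file here (DTS Data Pump task or BULK INSERT) ...

Exec ProcedureName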
 