
sql server equivalent for oracle %rowtype?


gacaccia

Technical User
May 15, 2002

i'm trying to port an oracle script that makes use of cursors over to sql server. in oracle, i can fetch from a cursor into a record that mirrors the cursor structure using %rowtype.

for example...
Code:
declare
  cursor c is select * from t;
  r c%rowtype;
begin
  open c;
  fetch c into r;
  close c;
end;

can i do something similar in sql server, or do i have to fetch into a separate variable for each table column (fetch next from c into @f1, @f2, @f3, @f4)?

thanks,

glenn
 
There is no equivalent for %rowtype in SQL Server... But the cursor could be replaced with set-based T-SQL statements. It is better to replace cursors wherever possible to get better performance.
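
If you do keep a cursor, the usual T-SQL pattern is to fetch into one variable per column. A rough, untested sketch (the table and column names here are just placeholders):

Code:
-- sketch only; replace t / f1 / f2 with your real table and columns
declare @f1 int, @f2 varchar(30)

declare c cursor for
    select f1, f2 from t

open c
fetch next from c into @f1, @f2

while @@fetch_status = 0
begin
    -- work with @f1 and @f2 here
    fetch next from c into @f1, @f2
end

close c
deallocate c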

Can you describe what you are trying to achieve with the Cursor, with some data and expected output?


Sunil
 
basically, there is a database with two tables ("tbl" and "var") that describe properties for all tables and fields used by an application. i need a script to verify the integrity of the data using a text-file equivalent of each.

in oracle, i create an external reference to each text file, create a cursor for each, and then step through the records in the cursor, verifying, column by column, that each record matches the table in the database.

sample data...

TBL
1 t1 i 30
2 t2 f 45
...

VAR
1 1 c1 2 2 0 B 0
1 2 c2 1 24 2 N 3
2 3 c1 8 1 26 S 0

the output is simply any case where the value for a given column in a given record in the database differs from the same field in the "master" text file.
 
..describe properties for all tables and fields used by an application
if you are trying to query the data types, field names, etc., in SQL Server you can query the system tables (sysobjects, syscolumns, etc.) and some of the INFORMATION_SCHEMA views to find information about the database objects.
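
For example, a quick sketch against the INFORMATION_SCHEMA views (the table name 'tbl' is just a placeholder):

Code:
-- column names and data types for one table
select TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = 'tbl'
order by ORDINAL_POSITION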

Now, instead of parsing through the files and doing a line-by-line check, you can put the file data into a staging table using the Bulk Copy utility (the bcp command). Once you have the data in a staging table you can use normal select statements to check it. I think this will be faster.

Another thing: the operation you are doing seems to be an ideal candidate for SQL Server's DTS (Data Transformation Services). There is a forum on Tek-Tips for DTS questions: forum961

Here is a link on the MSDN site for bulk copy.

And here is a link to the DTS documentation on MSDN.

Sunil
 
actually, the two tables (tbl and var) describe properties that are unrelated to SQL Server and the information cannot be obtained through the SQL Server system tables.

can you give an example of how you would use a select statement to verify that two tables are identical in content (and report on all differences)? that is basically what I need to do. I can see how to do it with a cursor, but I'm drawing a blank on using a select statement.

 
Can you provide some sample data from your text file, and explain the logic you are using to do this comparison? Also, what exactly is the purpose of the TBL and VAR tables?


Sunil
 
Sample data is as posted above. I assume you don't want to see the entire script. The basic logic is...

- create a cursor for the master copy of "tbl"
- create a cursor for the live copy of "tbl" with a parameter for the table name
- for each record in the master cursor, open the live cursor with the table name parameter to return a single record
- if the record is not found, report that the entry is missing
- if the record is found, compare each field in the live cursor record with each field in the master cursor record and report any differences

The same general process is used for the master and live "var" tables. As for the purpose of these tables: the application that uses the database was originally designed around a flat-file system. These tables act as a translator of sorts, so that the application interacts with a SQL Server or Oracle database as if it were the original flat-file system.
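
in T-SQL cursor terms, I picture something along these lines (rough, untested sketch with made-up table and column names):

Code:
-- walk the master copy with a cursor and look up the matching live row for each entry
declare @m_id int, @m_name varchar(30), @m_type varchar(10), @m_size int
declare @l_type varchar(10), @l_size int

declare master_cur cursor for
    select tbl_id, tbl_name, tbl_type, tbl_size from MasterTbl

open master_cur
fetch next from master_cur into @m_id, @m_name, @m_type, @m_size

while @@fetch_status = 0
begin
    if not exists (select 1 from LiveTbl where tbl_id = @m_id)
        print 'entry ' + cast(@m_id as varchar) + ' (' + @m_name + ') is missing'
    else
    begin
        select @l_type = tbl_type, @l_size = tbl_size
        from LiveTbl
        where tbl_id = @m_id

        if @l_type <> @m_type
            print @m_name + '.type = ' + @l_type + ' should be ' + @m_type
        if @l_size <> @m_size
            print @m_name + '.size = ' + cast(@l_size as varchar) + ' should be ' + cast(@m_size as varchar)
    end

    fetch next from master_cur into @m_id, @m_name, @m_type, @m_size
end

close master_cur
deallocate master_cur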
 
Will something like this work? I created two temp tables with the sample data you gave. The final select statement displays results categorized as below:
All Match
Some Columns Don't Match
Row Missing


Code:
create table #Master(f0 int, f1 varchar(4), f2 varchar(4), f3 int)
insert into #Master values(1,   't1',   'i',   30)
insert into #Master values(2,   't2',   'f',   45)

create table #Live(f0 int, f1 varchar(4), f2 varchar(4), f3 int)


BULK INSERT #Live
   FROM 'c:\import.txt'
   WITH 
      (
         FIELDTERMINATOR = '   ',
         ROWTERMINATOR = '\n'
      )

select * from #Master
Select * from #live



Select *, 
case 
when Total = 0 then 'All Match' 
when Total = 4 then 'Row Missing'   -- no matching row in #Master, so all four comparisons fail
else 'Some Columns Don''t Match' 
end as Result 
from #Live  t1 
inner join
(select t1.f0,
-- each case adds 1 when the column differs (or when the master row is missing)
case when t.f0 = t1.f0 then 0 else 1 end +
case when t.f1 = t1.f1 then 0 else 1 end +
case when t.f2 = t1.f2 then 0 else 1 end +
case when t.f3 = t1.f3 then 0 else 1 end Total
from #Live t1 
left outer join #Master t on  t1.f0=t.f0) t on t1.f0=t.f0

Sunil
 
The Sample data in my import.txt looks like this
Code:
1   t3   i   31
2   t2   f   45
3   t1   j   32


Sunil
 
thanks for the example! While unrelated to my initial question, I did not realize that you could use "bulk insert" to load text files from query analyzer. I also did not know how to declare temporary tables.

While the select statement approach certainly requires less code (and I imagine executes much faster), the output is not as clear as what I need. Basically, I'm looking for output like...

t1.type = 0 should be 5
t4.size = 64 should be 69
t11.pad = B should be S
 
Ok.. Try this code.
Code:
create table #Master(f0 int, f1 varchar(4), f2 varchar(4), f3 int)
insert into #Master values(1,   't1',   'i',   30)
insert into #Master values(2,   't2',   'f',   45)
insert into #Master values(3,   't4',   'f2',   50)
/* contents of c:\sunil\import.txt:
1   t3   i   31
2   t2   f   45
3   t1   j   32
*/

create table #Live(f0 int, f1 varchar(4), f2 varchar(4), f3 int)


BULK INSERT #Live
   FROM 'c:\sunil\import.txt'
   WITH 
      (
         FIELDTERMINATOR = '   ',
         ROWTERMINATOR = '\n'
      )

select * from #Master
Select * from #live


Select t1.f0, 
fieldDiff
from #Live  t1 
inner join
(select t1.f0,
case when t.f0 <> t1.f0 then 't1.f0 = ' + cast(t1.f0 as varchar) + ' should be ' + cast(t.f0 as varchar) else '' end fieldDiff
from #Live t1 
left outer join #Master t on  t1.f0=t.f0
Union
Select t1.f0,
case when t.f1 <> t1.f1 then 't1.f1 = ' + cast(t1.f1 as varchar) + ' should be ' + cast(t.f1 as varchar) else '' end fieldDiff
from #Live t1 
left outer join #Master t on  t1.f0=t.f0
Union
Select t1.f0,
case when t.f2 <> t1.f2 then 't1.f2 = ' + cast(t1.f2 as varchar) + ' should be ' + cast(t.f2 as varchar) else '' end fieldDiff
from #Live t1 
left outer join #Master t on  t1.f0=t.f0
Union
Select t1.f0,
case when t.f3 <> t1.f3 then 't1.f3 = ' + cast(t1.f3 as varchar) + ' should be ' + cast(t.f3 as varchar) else '' end fieldDiff
from #Live t1 
left outer join #Master t on  t1.f0=t.f0
) t on t1.f0=t.f0 and t.FieldDiff <> '' order by t1.f0
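
One caveat with the <> comparisons: a live row with no matching master row produces NULL comparisons, so it never shows up in the diff output at all. If that case matters, an extra check along these lines (untested sketch reusing the same temp tables) could be appended:

Code:
-- rows present in one table but missing from the other
select 'live row ' + cast(l.f0 as varchar) + ' has no master row' as fieldDiff
from #Live l
left outer join #Master m on l.f0 = m.f0
where m.f0 is null
union all
select 'master row ' + cast(m.f0 as varchar) + ' has no live row'
from #Master m
left outer join #Live l on m.f0 = l.f0
where l.f0 is null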


Sunil
 