
Can you iterate through fields in a table?

Status
Not open for further replies.

stiej

Technical User
Jan 20, 2003
142
GB
Hi,

How, in T-SQL, can I get my cursor to iterate through the fields in a given table, outputting the values in each field?

I'm trying to build an SP to automatically go through all my tables, then all my fields, and then build a bunch of INSERT INTO mytable (myfields) VALUES (myvalues) statements so I can populate the same tables with the same data on another server.

I've used sysobjects and syscolumns so far to identify my tables and fields and I've successfully built my INSERT INTO statements (including the field list after the word INTO) but only as far as the " VALUES(" bit.

I now need to actually get the values for EVERY record in each table - a separate INSERT INTO statement per record, per table.
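To show where I am so far, this is roughly how I'm building the column list (the table name is hard-coded here just for illustration):

```sql
-- Build the comma-separated column list for one table (SQL 2000 system tables).
-- 'mytable' is just an example name.
DECLARE @cols varchar(8000)
SELECT @cols = ISNULL(@cols + ', ', '') + name
FROM syscolumns
WHERE id = OBJECT_ID('mytable')
ORDER BY colid
PRINT 'INSERT INTO mytable (' + @cols + ') VALUES ('
```

It's the VALUES part after this that I'm stuck on.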

Any help will be much appreciated!
 
Have you considered replication?
How about linked servers with an INSERT ..... SELECT statement to populate the new server?
It seems you're doing a lot of extra work unless the new server is a different RDBMS.
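Something along these lines, say (server, database and column names here are made up):

```sql
-- Sketch: register the live box as a linked server, then copy data across.
-- LIVESERVER / MyDatabase / col1, col2 are example names only.
EXEC sp_addlinkedserver @server = N'LIVESERVER', @srvproduct = N'SQL Server'

INSERT INTO LIVESERVER.MyDatabase.dbo.mytable (col1, col2)
SELECT col1, col2
FROM mytable
```

One statement per table, no per-row scripting needed.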

-------------------------
The trouble with doing something right the first time is that nobody appreciates how difficult it was - Steven Wright
 
> I now need to actually get the values for EVERY record in each table - a separate INSERT INTO statement per record, per table.

Why per record? Unless you want to make data dump .sql script or something...

------
"There's a man... He's bald and wears a short-sleeved shirt, and somehow he's very important to me. I think his name is Homer."
(Jack O'Neill, Stargate)
[banghead]
 
Well... what we have here is a company policy, if you like, that any new database schema is entirely scripted (tables, keys, indexes, constraints, SPs etc.) including initial "start-up" data. This script then gets executed in the QA environment, then, when everyone's happy, in the Live environment. Thus the whole schema, and its initial data, can be recreated on any server whenever we want.

I have my script to do everything except populate the tables I've just created.

So I thought I'd write an SP to go through everything and churn out a bunch of INSERT INTO statements to paste onto the end of my script, which currently gets as far as recreating the "framework". The goal being that when the script runs, it creates the database AND populates it.

Thanks for your time...
 
What if you have millions of records?
How long will it take until all the data is created and all the indexes are built?
What about data that is coming in while you are testing? How are you going to get that data?
If you want exactly the same data and procs, back up QA and restore to the Live server (guaranteed to be the same).
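Something like this (database name and file paths are examples only):

```sql
-- Sketch: back up the QA database, copy the .bak to the live box, restore there.
-- MyDatabase, logical file names and paths are all illustrative.
BACKUP DATABASE MyDatabase
TO DISK = 'C:\Backups\MyDatabase.bak'

-- then, on the live server, after copying the file over:
RESTORE DATABASE MyDatabase
FROM DISK = 'C:\Backups\MyDatabase.bak'
WITH MOVE 'MyDatabase_Data' TO 'D:\Data\MyDatabase.mdf',
     MOVE 'MyDatabase_Log'  TO 'D:\Data\MyDatabase.ldf'
```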

Denis The SQL Menace
SQL blog:
Personal Blog:
 
Thanks for the reply.

Number of records will be less than 100, probably less than 10, per table.

Data coming in will stop when we're ready to go live, so my plan is to run this SP that I'm trying to write to grab whatever's in the database at that time, and that would be good enough. Then we can create all this on the Live server and the job's done.

I like the backup/restore idea, but for now, is there a way to get the values out of each field, for each record, for any table, just so I can have a script with INSERT INTO statements that satisfies the powers that be?

If not, we should rely on backup/restore I think.
 
OK. Let's suppose this must be done - company policy or whatever. What about foreign keys and other dependencies?

------
"There's a man... He's bald and wears a short-sleeved shirt, and somehow he's very important to me. I think his name is Homer."
(Jack O'Neill, Stargate)
[banghead]
 
Then, since the database as a whole will have good data in it before deployment to Live, and all keys are correctly scripted before any inserts, all is okay.

I already have the script I want, except that I had to create a bunch of INSERT INTOs manually for each record in each table. It executes from top to bottom - from nothing to a fully populated database with constraints - in one go without error. Creating the inserts manually isn't a long job, but an SP to do it for me when the time comes would be good.

And I'm 99% there; I just need a way to cycle through an unknown number of fields in a given table, for each record in the table. I can handle the formatting myself (i.e., apostrophes around strings, commas between each data item, etc.).

Perhaps, thinking about it more now, ActiveX may be better, inside a DTS package, creating a text file on my hard drive?

What I want, but can't seem to find, is extra functionality in a cursor to fetch all fields into as many destination fetch-variables as there are fields in the table, or to fetch all fields from a table into some sort of collection that I can loop through, concatenating each appropriately into a values list for the VALUES part of the INSERT INTO.


Thanks for your time guys!

 
It is possible to get the table structure; crawling over columns is another story. FETCH NEXT INTO what?
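If you must stay in T-SQL, you can dodge the FETCH...INTO problem with dynamic SQL: build one SELECT per table that concatenates the columns into the finished INSERT statement, so the variable column count never meets a cursor. Very rough sketch (SQL 2000 system tables; the table name, the varchar(8000) ceiling and the list of "needs quoting" types are all illustrative and would need extending):

```sql
-- Sketch: generate one INSERT statement per row of @table via dynamic SQL.
-- CHAR(39) is a single quote; using it avoids nested-quote soup.
DECLARE @table sysname, @cols varchar(8000), @vals varchar(8000), @sql varchar(8000)
SET @table = 'mytable'   -- example name

SELECT @cols = ISNULL(@cols + ', ', '') + c.name,
       @vals = ISNULL(@vals + ' + '', '' + ', '')
             + CASE WHEN t.name IN ('char', 'varchar', 'nchar', 'nvarchar',
                                    'datetime', 'smalldatetime', 'uniqueidentifier')
                    -- quote string-ish values and double any embedded quotes
                    THEN 'ISNULL(CHAR(39) + REPLACE(CONVERT(varchar(8000), ['
                         + c.name + ']), CHAR(39), CHAR(39) + CHAR(39)) + CHAR(39), ''NULL'')'
                    ELSE 'ISNULL(CONVERT(varchar(8000), [' + c.name + ']), ''NULL'')'
               END
FROM syscolumns c
JOIN systypes t ON c.xusertype = t.xusertype
WHERE c.id = OBJECT_ID(@table)
ORDER BY c.colid

SET @sql = 'SELECT ''INSERT INTO ' + @table + ' (' + @cols + ') VALUES ('' + '
         + @vals + ' + '')'' FROM ' + @table
EXEC (@sql)
```

Wrap that in a loop over sysobjects and you have your generator. But it is ugly, and varchar(8000) will bite on wide tables, which is why I'd still say: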

Better to make a small client-side tool. T-SQL is too... stiff for that purpose.

Try bcp/BULK INSERT for each table. Or possibly ActiveX (ADO) - recordsets can be saved in XML or ADTG format. Personally I'd do it with bcp because a) it is MUCH faster, and b) you can easily handle identity inserts (the -E switch), triggers (not fired by default during bulk loads) and other pitfalls.
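For example, per table, something like this (server names, paths and the trusted-connection switch are examples; you could equally run bcp straight from the command line instead of via xp_cmdshell):

```sql
-- Sketch: dump a table in native format from QA, load it on the live box.
-- -n = native format, -T = trusted connection, -S = server, -E = keep identity values.
EXEC master..xp_cmdshell
    'bcp MyDatabase.dbo.mytable out C:\dump\mytable.dat -n -T -S QASERVER'

-- ...and on the target server:
EXEC master..xp_cmdshell
    'bcp MyDatabase.dbo.mytable in C:\dump\mytable.dat -n -T -E -S LIVESERVER'
```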

Whatever you choose, the first step is to build a list of tables based on dependencies. 'Leaf' tables (without outgoing foreign keys) come first, then tables that depend only on leaf tables, then... you get the picture. That order matters for import only, not for export.
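Finding the leaf tables is a one-liner against the SQL 2000 system tables (sketch; repeat the idea level by level for the rest of the ordering):

```sql
-- Sketch: user tables with no outgoing foreign keys - safe to load first.
-- sysforeignkeys.fkeyid is the id of the table that OWNS the foreign key.
SELECT name
FROM sysobjects
WHERE xtype = 'U'
  AND id NOT IN (SELECT fkeyid FROM sysforeignkeys)
```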

------
"There's a man... He's bald and wears a short-sleeved shirt, and somehow he's very important to me. I think his name is Homer."
(Jack O'Neill, Stargate)
[banghead]
 
Here's a suggestion. I use a tool called TOAD. It was originally developed as an Oracle tool, but this year the vendor (Quest Software) has created a SQL Server version. This tool has lots of neat features, but one thing it allows you to do is select tables and export them as script files - you can have the table creation script statements followed by the data insertion script statements. There are other export options as well, but this one does exactly what you want (I think).

Anyway you can download a trial version and try it out. The trial version is fully functional, just has an expiration date.

Go to
then go down the list until you find TOAD for SQL Server, then click the "Trial" link.
 