
insert into table

I have two tables, A and B. Table A contains a series of records in which the same key repeats, with a different value in the same field in each record, until the key changes. For example:

Key Value
1 abc
1 def
1 ghi
2 abc
2 def
3 abc
4 abc

Table B should look like:
Key Value1 Value2 Value3
1 abc def ghi
2 abc def
3 abc
4 abc

Is it possible to do this in SQL? Any help is greatly appreciated! Thanks in advance.

J.
 
yes, it's possible, but it's very, very ugly sql

do you have microsoft access? it has a sweet crosstab feature

how about a reporting program of some kind, one you can do a loop in?

stored procedure? cursor?

for three values it's a three-way left outer self-join (see the sketch below)

you sure you wanna?
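
here's a sketch of what that self-join looks like. it assumes the values within a key are unique and that alphabetical order is the order you want (tableA has no sequence column), and it uses old-style oracle (+) outer joins because you can't outer-join a column straight to a subquery:

select v1.key,
       v1.value as value1,
       v2.value as value2,
       v3.value as value3
  from (select a.key, a.value, count(*) as pos
          from tableA a, tableA x
         where x.key = a.key
           and x.value <= a.value
         group by a.key, a.value) v1,
       (select a.key, a.value, count(*) as pos
          from tableA a, tableA x
         where x.key = a.key
           and x.value <= a.value
         group by a.key, a.value) v2,
       (select a.key, a.value, count(*) as pos
          from tableA a, tableA x
         where x.key = a.key
           and x.value <= a.value
         group by a.key, a.value) v3
 where v1.pos = 1
   and v2.key(+) = v1.key
   and v2.pos(+) = 2
   and v3.key(+) = v1.key
   and v3.pos(+) = 3;

like i said, ugly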
 
Thanks for the reply.

We are an Oracle shop and we also have MS Access.
The logical programming way is to create a loop that writes the record and resets its variables at the key-change break point, roughly the shape sketched below.
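
Roughly what I have in mind (untested, and assuming table B already exists with a key column plus three value columns, and that alphabetical order within a key is acceptable, since there is no sequence column):

declare
  cursor c is
    select key, value
      from tableA
     order by key, value;
  prev_key tableA.key%TYPE;
  v1       tableA.value%TYPE;
  v2       tableA.value%TYPE;
  v3       tableA.value%TYPE;
  cnt      pls_integer := 0;

  procedure write_row is
  begin
    insert into tableB (key, value1, value2, value3)
    values (prev_key, v1, v2, v3);
    v1 := null; v2 := null; v3 := null;  -- reset for the next key
    cnt := 0;
  end;
begin
  for r in c loop
    if cnt > 0 and r.key <> prev_key then
      write_row;                         -- key changed: write the finished row
    end if;
    prev_key := r.key;
    cnt := cnt + 1;
    if    cnt = 1 then v1 := r.value;
    elsif cnt = 2 then v2 := r.value;
    elsif cnt = 3 then v3 := r.value;
    end if;                              -- anything past three values is dropped
  end loop;
  if cnt > 0 then
    write_row;                           -- the last key still needs writing
  end if;
  commit;
end;
/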

Do you know where I can find some working sample code along these lines?

Thanks!

J.
 
I am working on the very same problem now (ORACLE 817).
My approach uses PL/SQL and dynamic SQL for the extensible part, since you don't know ahead of time how many values each key has.

1. I have a cursor (parent) for the unique set of "Key" values:

cursor key_list is
select distinct key
from tableA;

2. I have another cursor (child) for the set of "Value" values, with an argument of "ThisKey":

cursor value_list (ThisKey NUMBER) is
select
value
from tableA
where key = ThisKey
order by ???;

3. In my case, all Keys have the same number of Values, but I do not know what that number is ahead of time. I generate it. In your case, you need to know the max number so you can create table B.

select max(count(*))
into MAX_VALUES
from tableA
group by key;

4. Knowing the max, build a dynamic "CREATE TABLE TABLEB" statement with the key column plus a 'Value' column for each of them.

Something like this:
TABLEB_CREATE_STR:='CREATE TABLE tableb (key NUMBER';
FOR idx IN 1..MAX_VALUES LOOP
  TABLEB_CREATE_STR:=TABLEB_CREATE_STR||
    ', value'||to_char(idx)||' VARCHAR2(20)';
END LOOP;
TABLEB_CREATE_STR:=TABLEB_CREATE_STR||') STORAGE ...';

Create TableB:
EXECUTE IMMEDIATE TABLEB_CREATE_STR;
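
For the sample data (MAX_VALUES = 3), that builds:

CREATE TABLE tableb (key NUMBER, value1 VARCHAR2(20), value2 VARCHAR2(20), value3 VARCHAR2(20)) STORAGE ...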

5. Now that you have the TableB, loop through the parent/child cursors to get the individual values. For each "Value" selected for a key, add it into a local, dynamic VARRAY variable:

VARRAY type and variable (the variable is named val_arr here so it does not collide with the value_list cursor from step 2):
create type VLIST is varray(100) of VARCHAR2(20);
val_arr VLIST;

Loop through the Key/Value nested cursors (loop_counter is a PLS_INTEGER declared in the block):

FOR this_key IN key_list LOOP
  loop_counter := 0;                    -- reset for each key
  FOR this_val IN value_list(this_key.key) LOOP

    -- Stuff each "this_val" into the varray.
    -- The constructor must be used for the first element;
    -- after that the varray is extended by one slot per value.
    loop_counter := loop_counter + 1;
    IF loop_counter = 1 THEN            -- instantiate the VARRAY
      val_arr := VLIST(NULL);
    ELSE                                -- expand the VARRAY by 1
      val_arr.EXTEND;
    END IF;
    IF this_val.value IS NOT NULL THEN
      val_arr(loop_counter) := this_val.value;
    END IF;

  END LOOP; -- child loop of values

6. Now that you have a varray of the values for this_key, insert the record into TableB using dynamic SQL. The INSERT string is built here, inside the key loop, with the actual values concatenated in as literals, because EXECUTE IMMEDIATE will not resolve a PL/SQL variable name embedded in the string as text (this also assumes the values contain no single quotes):

TABLEB_INSERT_STR:='INSERT INTO tableb VALUES ('||to_char(this_key.key);
FOR idx IN 1..MAX_VALUES LOOP
  IF idx <= val_arr.COUNT THEN
    TABLEB_INSERT_STR:=TABLEB_INSERT_STR||', '''||val_arr(idx)||'''';
  ELSE
    TABLEB_INSERT_STR:=TABLEB_INSERT_STR||', NULL';
  END IF;
END LOOP;
TABLEB_INSERT_STR:=TABLEB_INSERT_STR||')';

EXECUTE IMMEDIATE TABLEB_INSERT_STR;
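
With the sample data, the statement built for key 2, for example, comes out as:

INSERT INTO tableb VALUES (2, 'abc', 'def', NULL)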


7. Free up the VARRAY for the next set of Key values
and commit your insert

val_arr.delete;
commit;

8. End the Key Loop and the script

END LOOP; -- parent list of keys
COMMIT;
END;
/
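
For reference, the declarations the fragments above assume look something like this (the VARCHAR2 sizes are just examples):

DECLARE
  -- cursors key_list and value_list as defined in steps 1 and 2
  TABLEB_CREATE_STR  VARCHAR2(4000);
  TABLEB_INSERT_STR  VARCHAR2(4000);
  MAX_VALUES         PLS_INTEGER;
  loop_counter       PLS_INTEGER;
  val_arr            VLIST;    -- the VARRAY type created in step 5
BEGIN
  -- steps 3 through 8 go here
  NULL;
END;
/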

"Helping others to help themselves..."
=================================
Thomas V. Flaherty Jr.
Birch Hill Technology Group, Inc.
 
nice one, thomas

illustrating once again that denormalization is best done procedurally, even if it uses a database table to store the results

:)
 