
Relational table import design question

Status
Not open for further replies.

SQLSister

Programmer
Jun 18, 2002
7,292
US
OK, it's like this - I have a situation where I have to import data from multiple related tables into a structure with different related tables. So far, not too bad. In the past when I've done this, all the data came from one source with one data structure, so I used the primary/foreign key data as it was structured in the original tables in my design for the tables that would hold the imported data.

Unfortunately, in this case I will be receiving data from 50 separate data sources, each with a completely different database design and structure. To make matters worse, our marketing people haven't actually negotiated with all the data providers yet, so I am designing without a real idea of the scope of the whole problem. However, based on the data I do have access to, it is a certainty that the primary keys will not be compatible between the different data sources. Each data source will have several hundred thousand records. Data will be imported weekly.

So I need to convert the relationships in these tables to the primary/foreign key structure I have designed. Right now the only way I can think to do this is to import the tables into staging tables, then insert the data from the main staging table for the state I am processing into the main production table. At that point, I would update the staging table with the new ID field and use triggers on it to update the related staging tables. Then I would import their data to the appropriate places. I was thinking I would have to use a cursor to do this, but as I was writing this I realized how I can write the update query to avoid that. See, you've half solved my problem already.

My real question though is does this process make sense? Is there a better way to approach this?
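To make the process above concrete, here is a minimal sketch of the staging-table flow, using Python's built-in sqlite3 for illustration (the real system would be a server RDBMS). All table and column names (stg_customer, stg_order, customer, orders, src_id, new_id) are hypothetical stand-ins for whatever the provider's structure and the target design actually contain.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging tables keep the source's own key, plus a column for the new key.
cur.execute("CREATE TABLE stg_customer (src_id TEXT, name TEXT, new_id INTEGER)")
cur.execute("CREATE TABLE stg_order (src_customer_id TEXT, amount REAL)")

# Main tables use our own surrogate keys.
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")

# 1. Load the provider's data into staging as-is.
cur.executemany("INSERT INTO stg_customer (src_id, name) VALUES (?, ?)",
                [("A1", "Acme"), ("B2", "Bolt Co")])
cur.executemany("INSERT INTO stg_order VALUES (?, ?)",
                [("A1", 100.0), ("A1", 50.0), ("B2", 75.0)])

# 2. Insert into the main table, generating the new surrogate keys.
cur.execute("INSERT INTO customer (name) SELECT name FROM stg_customer")

# 3. Write the new keys back into staging with a set-based update
#    (no cursor needed), matching on a natural key - here, the name.
cur.execute("""UPDATE stg_customer
               SET new_id = (SELECT id FROM customer
                             WHERE customer.name = stg_customer.name)""")

# 4. Import the related rows, translating old keys to new via staging.
cur.execute("""INSERT INTO orders (customer_id, amount)
               SELECT s.new_id, o.amount
               FROM stg_order o
               JOIN stg_customer s ON s.src_id = o.src_customer_id""")
conn.commit()
```

Step 3 is the set-based update that replaces the cursor: one UPDATE handles every row of the staging table, and the related staging tables can be resolved the same way through a join.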



 
Hi there
I'm not professing to be anything even vaguely resembling an expert, but I would have thought your scenario would be the only way to go about it, given the disparity of the 50 data providers' DB structures.

Unless a possibility would be to denormalise the incoming data into staging table(s), and then re-normalise it to fit your structure when you import it into your DB...?
This is of course presuming that the structures of the 50 DBs would allow for a more all-encompassing denormalised table, and that the complexity of the queries needed for the process wouldn't turn out to be ridiculous!
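A rough sketch of that denormalise/re-normalise idea, again using sqlite3 only for illustration - one wide staging table where every child row carries its parent's attributes, then split back out into the target design. Column names here are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One wide, denormalised staging table: every incoming order row
# repeats its customer's attributes.
cur.execute("""CREATE TABLE stg_flat (
    customer_name TEXT, customer_region TEXT, order_amount REAL)""")
cur.executemany("INSERT INTO stg_flat VALUES (?, ?, ?)",
                [("Acme", "US", 100.0), ("Acme", "US", 50.0),
                 ("Bolt Co", "UK", 75.0)])

# Target (re-normalised) tables with our own surrogate keys.
cur.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
cur.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")

# Re-normalise: distinct parents first, then children joined back on
# the natural key to pick up the newly generated surrogate key.
cur.execute("""INSERT INTO customer (name, region)
               SELECT DISTINCT customer_name, customer_region FROM stg_flat""")
cur.execute("""INSERT INTO orders (customer_id, amount)
               SELECT c.id, f.order_amount
               FROM stg_flat f JOIN customer c ON c.name = f.customer_name""")
conn.commit()
```

The trade-off is exactly the one noted above: the flat table must be wide enough to cover all 50 providers' attributes, and the duplication can get large at several hundred thousand rows per source.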

Hope this makes sense
Good luck
LFCfan
 
That's the way to go. Personally, I almost always use staging tables; it's much easier to verify and scrub the data that way before it gets into your live data.
 