
Data Modeling in Teradata


susheeltips (Programmer), Nov 8, 2001, US
Hi!

I would appreciate it if someone could help me learn about the technologies for performing data modeling (logical and physical) in a Teradata RDBMS environment.

Thanks in advance,
 
The methodologies for translating your logical model into a physical model are far too lengthy to describe in this type of forum. I would suggest that you download some of the documentation from the NCR technical pubs website. One of the manuals of interest to your search would be the "Database Design" manual, document number 1094061A. However, if you are not familiar with Teradata, I suggest that you get some help with this. It is not just a cookbook procedure; all of the numerous inter-related factors should be considered. If you are concerned with performance, you should rely heavily on the details in the documentation, coupled with the experience of someone who has implemented several data warehouses.
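To give a flavor of what the physical side involves, here is a minimal Teradata DDL sketch (the database, table, and column names are hypothetical) of the kind of choice the documentation walks through: the primary index, which determines how rows are hashed across the AMPs and therefore how evenly the work is spread and how fast single-row access is.

    -- Hypothetical customer table; the PRIMARY INDEX choice controls
    -- row distribution across AMPs and the access path for lookups.
    CREATE MULTISET TABLE sales_db.customer
    ( customer_id   INTEGER      NOT NULL,
      customer_name VARCHAR(100) NOT NULL,
      region_id     INTEGER
    )
    UNIQUE PRIMARY INDEX (customer_id);

Whether UNIQUE is appropriate, and whether a MULTISET table is acceptable, are exactly the sorts of inter-related factors mentioned above.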

Also, be aware that if you work with NCR you may get the standard NCR sales pitch to keep your model as close to 3NF (third normal form) as possible, because it "models the business rules". Yes, this is true, but it does not always (and in fact seldom does) align with how the users want to use the information. It is not normally in your best interest, but it does sell a lot more iron. I know because I used to work for NCR.
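For illustration only (hypothetical tables again), a strictly 3NF layout keeps the region name off the transaction data, so every report broken out by region has to join through the customer table:

    -- Strict 3NF: region_name lives only on the region table, so any
    -- query reporting by region must join sales_txn -> customer -> region.
    CREATE TABLE sales_db.region
    ( region_id   INTEGER     NOT NULL,
      region_name VARCHAR(50) NOT NULL
    )
    UNIQUE PRIMARY INDEX (region_id);

    CREATE TABLE sales_db.sales_txn
    ( txn_id      INTEGER NOT NULL,
      customer_id INTEGER NOT NULL,
      txn_date    DATE,
      txn_amount  DECIMAL(12,2)
    )
    PRIMARY INDEX (customer_id);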

I suggest that you first understand the business problems you are trying to solve, and prioritize them by value if this can be done. Second, understand the projected business questions and translate these into query plans. If you already have a model, you should leverage the queries already being executed. The bottom line is that you should build the model that best aligns with how the information will be used. I realize that usage will change over time, but it is likely that you can address most, if not all, of the basic information needs with a single model. Denormalize where doing so reduces joins or other resource-consumptive processing that runs at high frequency.
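As a sketch of that last point (still using the hypothetical tables above), if the highest-frequency question is something like monthly sales by region, a denormalized summary table answers it without the two joins on every run, at the cost of extra storage and work during the load cycle:

    -- Denormalized summary for a high-frequency question, refreshed at load time.
    CREATE TABLE sales_db.monthly_region_sales
    ( sales_year   SMALLINT    NOT NULL,
      sales_month  SMALLINT    NOT NULL,
      region_name  VARCHAR(50) NOT NULL,
      total_amount DECIMAL(15,2)
    )
    PRIMARY INDEX (sales_year, sales_month, region_name);

    INSERT INTO sales_db.monthly_region_sales
    SELECT EXTRACT(YEAR FROM t.txn_date),
           EXTRACT(MONTH FROM t.txn_date),
           r.region_name,
           SUM(t.txn_amount)
    FROM sales_db.sales_txn t
    INNER JOIN sales_db.customer c ON c.customer_id = t.customer_id
    INNER JOIN sales_db.region   r ON r.region_id   = c.region_id
    GROUP BY 1, 2, 3;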

The above task is not easy, but if it is done with these factors in mind you will increase your chances of success. Let me know if you have further questions.

Doug Drake
(MOZC)
 
