Hi,
I have to load between 1 and 2 billion records each month from a single data file into one or potentially more fact/dimension tables in my UDB data warehouse, and I was wondering, for capacity planning's sake, whether there are any best practices or guidelines I should heed before I begin designing my tables.
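For context, I was picturing the monthly load itself going through the LOAD utility, roughly like this (the file path and the sales_fact target table, sketched further below, are just placeholders):

    db2 "LOAD FROM /staging/monthly_feed.del OF DEL
         INSERT INTO sales_fact
         NONRECOVERABLE"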
E.g., can DB2 tables even store tens of billions of records and still perform acceptably? With this much data, can I stick to traditional modeling principles (e.g., 3NF or dimensional), or should other techniques be used, such as one table per monthly partition (so 12 tables for 1 year of data, 36 tables for 3 years, etc.), or something like the partitioned-table sketch below?
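To make that last idea concrete, here's a minimal sketch of what I mean, assuming DB2 9 or later with table partitioning, so it would be one logical table with a range per month rather than 12 physically separate tables; the sales_fact table and its columns are hypothetical:

    CREATE TABLE sales_fact (
        sale_date  DATE          NOT NULL,  -- partitioning key (hypothetical)
        store_id   INTEGER       NOT NULL,
        amount     DECIMAL(12,2)
    )
    PARTITION BY RANGE (sale_date)
    -- 36 monthly ranges covering 3 years of data
    (STARTING '2023-01-01' ENDING '2025-12-31' EVERY 1 MONTH);

Is that the kind of approach people use at this scale, or is splitting the data into physically separate tables still preferred?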
Any help would be greatly appreciated!