We're migrating one of our datasets to a new warehouse. In our current data, the fact table has 350,000 rows. The new warehouse designer has migrated the data, and his new fact table now has 12.5 million rows: he made sure that every combination of our dimensions (hospitalid, measureid, reportingperiodid) was loaded from the lookup tables, so 12.3 million of the 12.5 million rows have NULLs in every fact column.
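To illustrate what I think he's done, here's a minimal pandas sketch: cross-join every dimension key so that every possible (hospitalid, measureid, reportingperiodid) combination gets a fact row, whether or not a measurement exists for it. The cardinalities below are hypothetical, chosen only so the product lands near the 12.5 million figure; our real dimension sizes differ.

```python
import pandas as pd

# Hypothetical dimension key lists (real cardinalities are different)
hospitals = pd.DataFrame({"hospitalid": range(500)})
measures = pd.DataFrame({"measureid": range(250)})
periods = pd.DataFrame({"reportingperiodid": range(100)})

# Every (hospital, measure, period) combination becomes a fact row...
all_combos = hospitals.merge(measures, how="cross").merge(periods, how="cross")
print(len(all_combos))  # 12,500,000 rows

# ...but only a small fraction of those combinations have real measurements,
# so the fact columns on the remaining rows are left NULL.
```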
We're trying to run benchmarks and percentiles against the whole fact table, and it's taking 3-4 minutes to run, whereas on our old system it takes about 20 seconds.
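The percentile runs are essentially doing something like the toy sketch below (the column name factvalue and the data are hypothetical). The all-NULL rows don't change the result, since percentile/quantile calculations skip NULLs, but the engine still has to read and scan them.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# 12.5M-row fact table where only the first 350K rows have real values
fact = pd.DataFrame({"factvalue": np.full(12_500_000, np.nan)})
fact.loc[:349_999, "factvalue"] = rng.random(350_000)

# Percentiles over the full padded table vs. only the populated rows:
# identical answers, but the first has ~35x more rows to scan.
print(fact["factvalue"].quantile([0.25, 0.5, 0.9]))
print(fact.loc[fact["factvalue"].notna(), "factvalue"].quantile([0.25, 0.5, 0.9]))
```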
He's pretty adamant about not changing this model.
I don't understand his reasoning. Could someone enlighten me as to why loading every possible dimension combination is supposed to be efficient?