I'm creating a SQL database for a client and, in a recent implementation discussion, I was instructed not to use RI. This confused me, as I've never developed a normalized relational database without implementing referential integrity in some form, whether through DRI (declared foreign key constraints) or triggers. The reasoning given was that RI consumes too many resources on the server, and that I should instead enforce cascade updates and deletes at the application level.
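For clarity, by DRI I mean declared foreign keys with cascade rules, along the lines of the sketch below (table and column names are purely illustrative):

CREATE TABLE Orders (
    OrderID    INT PRIMARY KEY,
    CustomerID INT NOT NULL,
    -- Declared FK: the engine itself enforces the relationship
    -- and propagates updates/deletes from the parent table.
    CONSTRAINT FK_Orders_Customers
        FOREIGN KEY (CustomerID)
        REFERENCES Customers (CustomerID)
        ON UPDATE CASCADE
        ON DELETE CASCADE
);

The alternative being proposed is to drop constraints like this entirely and have the application code perform the equivalent checks and cascading changes itself.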
Can anyone briefly enlighten me on this subject and whether the argument has merit? In particular, are the performance trade-offs in implementing RI vs. not implementing it that significant?
The db I'm developing is largely used for analytical as opposed to transactional processing. It is primarily a monthly aggregation of disparate data sources into one normalized db used for reporting. There are very few row-level updates.
Thanks,
Rick