Okay, I think we can settle some of your concerns. I've pasted the following from your response:
"Almost every query will be using NVL. Each will
retrieve thousands of rows"
Queries returning thousands of rows are bread-and-butter stuff to Oracle. As long as you have indexes in the right places and store your data sensibly so that you can build an efficient query, you should be fine.
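Since your question is specifically about NVL, one option worth knowing about is a function-based index on the NVL expression itself, so the optimiser can still use an index even though the column is wrapped in a function. A minimal sketch (the table and column names below are made up for illustration):

    -- Hypothetical table/columns, just to show the idea.
    -- The index is built on the NVL expression, not the raw column.
    CREATE INDEX idx_orders_region
      ON orders (NVL(region_code, 'UNKNOWN'));

    -- A query that uses the same expression can then be served by the index:
    SELECT order_id, order_total
      FROM orders
     WHERE NVL(region_code, 'UNKNOWN') = :region;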
", and there a hundreds
of clients running those queries simultaneously."
Again, this shouldn't be a problem for Oracle. Oracle will do two BIG things to help you here. Firstly, as long as the queries your clients are running are syntactically identical, Oracle will be able to re-use pre-parsed queries. Parsing is what turns your SELECT ... FROM ... WHERE into a query plan, which states which tables Oracle will access, in what order, and how (indexes etc.). With the advent of the Cost Based Optimiser this can be quite an overhead, and significant performance gains can be found by keeping the parse rate low. Ensure you are using bind variables in your queries appropriately, and that your SQL follows some standard so that all the statements look uniform.
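To make the bind-variable point concrete (again with made-up table and column names): the statements below ask the same question, but only the bind-variable version lets Oracle parse once and share the plan across hundreds of clients.

    -- Hard-coded literals: every distinct value is a different statement
    -- as far as the shared pool is concerned, so each one gets parsed.
    SELECT cust_name FROM customers WHERE cust_id = 101;
    SELECT cust_name FROM customers WHERE cust_id = 102;

    -- Bind variable: one statement, parsed once, re-used by everyone.
    SELECT cust_name FROM customers WHERE cust_id = :cust_id;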
Secondly, Oracle operates a buffer cache of frequently used data, so that if different clients are accessing the same information it is read from memory rather than repeatedly from disk. This area can be tuned for better performance; look up the SGA (Shared Global Area), as the buffer cache is part of it.
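If you have access to the V$ views, a quick way to get a feel for those memory areas is something like the sketch below. The hit-ratio formula is the classic rule-of-thumb calculation; treat it as an indicator rather than a target.

    -- Overall SGA component sizes.
    SELECT * FROM v$sga;

    -- Rough buffer cache hit ratio from the statistics views.
    SELECT 1 - (phy.value / (cur.value + con.value)) AS hit_ratio
      FROM v$sysstat phy, v$sysstat cur, v$sysstat con
     WHERE phy.name = 'physical reads'
       AND cur.name = 'db block gets'
       AND con.name = 'consistent gets';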
Furthermore, CASE still has to be evaluated for every row (and is, if anything, more costly), so you will have the same problem if you use that instead.
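For reference, these two predicates are logically equivalent (column and table names made up), so swapping one for the other does not get rid of the per-row evaluation:

    -- NVL version.
    SELECT * FROM orders WHERE NVL(status, 'NEW') = :status;

    -- CASE version: does the same thing, still evaluated row by row.
    SELECT * FROM orders
     WHERE CASE WHEN status IS NULL THEN 'NEW' ELSE status END = :status;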
Finally, performance often seems to be a bottomless pit of resource usage. You could spend years getting a 5% increase in a query's performance, and then someone could suggest a different solution and you would be back to square one. IMHO the best plan is to state the required performance up front, before any code is written, and then stop tuning once that target is achieved.
HTH,
Mike.