Well, yes, sort of - and anyway this was exactly JMCG's original problem: "admin areas of a site that are ranking higher than the main pages". I.e. the pages being listed at all.
I've personally found that even if you use robots.txt AND meta-tag voodoo, pages can still end up both listed and spidered if there are links pointing to them, especially from other sites. It shouldn't happen, but it does. Maybe it's simply that once they're in the index, they never get revisited. Or it's a timing issue.
I.e. imagine this: say the page is already in one of the Google indexes, and suddenly we humans slap a robots.txt "don't go there" instruction on it plus a "noindex" meta tag. The robot just does what we tell it to. It knows about the page and tries to go there, but because it's a good little robot it obeys robots.txt and doesn't fetch it. So it never gets the chance to see that the page now has noindex on it. There are lots of variants on this - after all, computers just do what we tell them to.
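You can see the mechanism with Python's standard-library robots.txt parser - a rough sketch, with a made-up site and paths, of what a polite crawler does before fetching anything:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks the admin area
robots_txt = """
User-agent: *
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# A well-behaved crawler checks robots.txt before every fetch.
# Because /admin/ is disallowed, it never downloads the page at all,
# so it never sees any <meta name="robots" content="noindex"> inside it.
url = "https://example.com/admin/settings.html"
if parser.can_fetch("*", url):
    print("fetch the page, read its meta tags")
else:
    print("skip the page - any noindex tag on it goes unseen")
```

That's the catch-22: the robots.txt exclusion is exactly what stops the crawler from ever reading the noindex instruction, so a page that's already in the index can just sit there.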