Okay, so the traditional Ethernet rule of thumb says that around 40% utilization is pretty much saturation, mainly because at that point collisions statistically pick up to where efficiency starts to fall off.
But collisions are a thing of the past where full duplex is concerned, so in a switched environment, what is the realistic utilization level before you'd consider the 'network' to be saturated?
I mean, I realize it's going to depend on what's on the end of the connections... a slow server, etc., may cause retransmissions, but in general, what's a good 'starting point' for setting monitoring thresholds when you have a switched 10/100/1000 environment?
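For context on what "marking a threshold" might look like in practice, here's a minimal sketch of turning two samples of a port's octet counters (e.g., SNMP ifHCInOctets/ifHCOutOctets, polled per direction since the link is full duplex) into a utilization percentage and checking it against alert levels. The 60%/80% figures are purely illustrative placeholders for whatever starting point people suggest, not values from this post:

```python
# Sketch: compute per-interval utilization from interface octet counters and
# compare against illustrative warn/critical thresholds. Counter source and
# the 60%/80% values are assumptions for the example only.

def utilization_pct(octets_start: int, octets_end: int,
                    interval_s: float, link_bps: int) -> float:
    """Percent utilization of one direction of a link over one polling interval."""
    bits_moved = (octets_end - octets_start) * 8          # counter delta in bits
    return 100.0 * bits_moved / (interval_s * link_bps)   # share of link capacity

# Example: a gigabit port that moved 3,000,000,000 octets in a 5-minute poll.
util = utilization_pct(octets_start=0,
                       octets_end=3_000_000_000,
                       interval_s=300,
                       link_bps=1_000_000_000)

WARN, CRIT = 60.0, 80.0   # hypothetical starting thresholds for a full-duplex port
if util >= CRIT:
    print(f"CRITICAL: {util:.1f}% utilization")
elif util >= WARN:
    print(f"WARNING: {util:.1f}% utilization")
else:
    print(f"OK: {util:.1f}% utilization")
```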