Let me preface this by saying I'm thinking mostly of the SOHO market. Also, I'm not exactly sure which forum to post this in ~ so if it needs to be moved, please do.
It seems that the vast majority of Gigabit devices, if not all of them, claim to be 10 times faster than Fast Ethernet (100 Mb/s) ~ duh * drools *. That includes routers, switches, adapters, and hubs. However, the IEEE 802.3 spec doesn't include jumbo frames.
On nearly every device I can think of, in a Windows environment, sends and receives max out at around 100 Mb/s ~ only 10% of the link. Out of the box, the native WHQL drivers in Windows (or the ones on the bundled disc) do not support jumbo frames. You have to tweak the adapter settings, which may or may not get you to that 100 Mb/s mark (typically you'll be running at more like 5%). Moreover, a switch or router usually spreads gigabit across 4-5 ports, or roughly 200-250 Mb/s per port.
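For what it's worth, one thing that helped me was confirming what the OS thinks each adapter actually negotiated before blaming the hardware. This is only a rough sketch in Python (it assumes you've installed the third-party psutil package, e.g. with pip install psutil); it prints each active interface's link speed and the MTU currently in effect, so you can tell whether a jumbo-frame tweak really took:

import psutil

# Print the negotiated link speed and current MTU for every interface that is up.
# speed is reported in Mb/s; an MTU near 9000 means a jumbo-frame setting stuck,
# while 1500 means you're still on standard frames.
for name, stats in psutil.net_if_stats().items():
    if not stats.isup:
        continue
    print(f"{name}: {stats.speed} Mb/s link, MTU {stats.mtu} bytes")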
Honestly, my home network is nuts when it comes to trying to achieve maximum rates between devices. The Internet side isn't much of an issue, since that's controlled and limited by my ISP. Now and then there's a download to a common location, but it's usually under 20 MB and then used by multiple systems; DLNA is a great thing.
So I have a modem, into a router, into separate switches: Wi-Fi (cell phone, printer, friends' tablets and laptops), Fast Ethernet (VoIP, Blu-ray, my non-jumbo-frame devices, Xbox), and Jumbo (NAS, a few PCs). I've spent a fair amount of time trying to optimize my home network for speed. I deal with a lot of video files and editing, and the files live on my network (as do the backups), so network speed is really critical to me.
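Since a lot of my tuning ends up being trial and error, here's the kind of bare-bones transfer test I've been using to compare boxes, rather than trusting whatever a file-copy dialog reports. It's only a sketch in Python; the 192.168.1.50 address and port 5001 are placeholders for whatever hosts you actually want to test between, and a real tool like iperf will do a better job:

import socket, time

PORT  = 5001
CHUNK = b"\x00" * 65536            # 64 KiB per send/recv call
TOTAL = 512 * 1024 * 1024          # move 512 MiB per run

def receive():
    # Run this on the machine receiving the data (e.g. the NAS or a PC).
    with socket.socket() as srv:
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            got, start = 0, time.time()
            while got < TOTAL:
                data = conn.recv(len(CHUNK))
                if not data:
                    break
                got += len(data)
            secs = time.time() - start
            print(f"{got * 8 / secs / 1e6:.0f} Mb/s over {secs:.1f} s")

def send(host="192.168.1.50"):
    # Run this on the sending machine, pointed at the receiver's LAN address.
    with socket.socket() as cli:
        cli.connect((host, PORT))
        sent = 0
        while sent < TOTAL:
            cli.sendall(CHUNK)
            sent += len(CHUNK)

# Call receive() on one box first, then send("receiver's address") on the other.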
I guess what I'm asking is: is it really fair to claim "10 times faster than Fast Ethernet" if you're only ever getting Fast Ethernet speeds? It's like buying a car with a speedometer that goes to 100 mph but only lets you drive 5 mph normally, or 10 mph after you tweak things. How are people supposed to get to even 50% utilization? How does the common Gigabit device really improve performance without jumbo frames?
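Just to put my own back-of-the-envelope numbers on the jumbo-frame part of the question (these are my assumptions, not anything off a spec sheet): every frame on the wire carries a fixed overhead, so the theoretical payload rate of a gigabit link with standard 1500-byte frames versus 9000-byte jumbo frames works out roughly like this:

# Rough framing-efficiency math for a gigabit link, assuming plain TCP/IPv4
# with no options and the usual fixed per-frame costs on the wire.
LINK_MBPS    = 1000
ETH_OVERHEAD = 14 + 4 + 8 + 12    # Ethernet header + FCS + preamble + inter-frame gap
IP_TCP       = 20 + 20            # IPv4 + TCP headers carried inside each frame

for mtu in (1500, 9000):
    payload = mtu - IP_TCP                 # useful data per frame
    on_wire = mtu + ETH_OVERHEAD           # bytes the link is busy for per frame
    eff = payload / on_wire
    print(f"MTU {mtu}: {eff:.1%} efficient, ~{LINK_MBPS * eff:.0f} Mb/s of payload")

If that math is anywhere near right, standard frames already allow ~950 Mb/s of payload and jumbo frames only buy a few percent more on paper, so whatever is pinning my transfers near Fast Ethernet speeds presumably isn't frame size alone ~ which is exactly the part I'd like to understand.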