I had a small debate with a friend about how transfer rates relate to bandwidth, and I wanted some clarification.
Let's say we have two isolated LANs: one a 100 Mb network and one a 1 Gb network (both copper).
If I transfer one file from point A to point B using CIFS, would it take the same amount of time on a 100 Mb network as it would on a 1 Gb network (in theory), or would it transfer faster on a gigabit network? My argument was that, regardless of CIFS' chatty nature, since a gigabit network can receive a signal at a rate of 1 billion 1s and 0s per second, the request/response exchange would happen much faster than it would on a 100 Mb network. My colleague's argument is that because it is running over copper, the time it takes for a packet to travel from point A to point B will always be the same, and therefore, no matter how fast your network is, in theory the packet will always arrive at the same time.
Is that true? Am I misunderstanding how fast a packet can travel? In school, my professor told me that in copper wiring, the wire is electrically charged on and off to signal a 0 or a 1. He went on to say that if it's a 10 Mb line, the charge goes on and off 10 million times per second. With that reasoning in mind, I figured that if a gigabit network can transmit 1 billion 1s and 0s per second, then each bit would be received in one billionth of a second, because the actual electricity is traveling over a linear path.
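To make my side of the argument concrete, here is the rough back-of-the-envelope arithmetic I had in mind, written as a small Python sketch. The 100 MB file size, 100 m cable run, and ~2x10^8 m/s signal speed in copper are just assumed numbers for illustration, not anything from our actual networks:

    # Back-of-the-envelope numbers behind my reasoning.
    # File size, cable length, and signal speed are assumed values for illustration only.
    FILE_SIZE_BITS = 100 * 8 * 10**6   # a hypothetical 100 MB file, expressed in bits
    CABLE_LENGTH_M = 100               # a hypothetical 100 m copper run
    SIGNAL_SPEED_M_S = 2 * 10**8       # rough signal speed in copper, about 2/3 the speed of light

    for rate_bps in (10 * 10**6, 100 * 10**6, 10**9):          # 10 Mb/s, 100 Mb/s, 1 Gb/s
        bit_time = 1 / rate_bps                                 # time to put one bit on the wire
        serialization_time = FILE_SIZE_BITS / rate_bps          # time to clock the whole file onto the wire
        propagation_delay = CABLE_LENGTH_M / SIGNAL_SPEED_M_S   # travel time down the cable; same for every rate
        print(f"{rate_bps / 10**6:6.0f} Mb/s: one bit every {bit_time * 10**9:5.0f} ns, "
              f"file serializes in {serialization_time:6.2f} s, "
              f"propagation delay {propagation_delay * 10**6:.1f} us")

The way I see it, the per-bit time and the time to push the whole file onto the wire shrink as the rate goes up, while the travel time down the copper itself stays the same, and that seems to be exactly where my colleague and I disagree.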