This was asked by a student, and I can’t find the answer. We have to agree to set/read the wire at a specific rate. In the real internet, data is sent at different rates by different devices. What’s the protocol that makes that possible?
Unfortunately I don’t know the answer to this one.
To clarify, are you asking how the two devices agree on the same rate? I’m guessing different devices are capable of transferring at different rates but ultimately must sync their rates when sending/receiving (and that rate probably fluctuates over time, so there’s presumably some method to periodically make sure the devices’ bit rates remain synced), with the max synced rate capped at the slower device’s max rate.
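For the “capped at the slower device’s max rate” part, one real-world mechanism I know of is Ethernet auto-negotiation (IEEE 802.3 Clause 28): each side advertises the rates it supports, and both pick the highest rate they have in common. A toy sketch of just that selection step (the function name and structure are mine, not from the standard):

```python
# Toy model of the rate-selection step in Ethernet auto-negotiation:
# each side advertises a set of supported rates (in Mbps), and both
# independently pick the highest rate common to both sets.
def negotiate(local_caps, peer_caps):
    common = set(local_caps) & set(peer_caps)
    return max(common) if common else None  # None: no common rate, link fails

print(negotiate({10, 100, 1000}, {10, 100}))  # -> 100
```

The real protocol does this by exchanging link pulses on the wire before any data flows, but the outcome is exactly this: the fastest rate both ends support.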
Yes, that’s the question. The best I can imagine is something like radio systems, where there’s a default channel (timing, in this case), and then on that channel you both agree to switch to a different frequency. That would also explain why transfer rates fluctuate during the same connection or download, since a computer could be switching timings several thousand times a second if that gave the best speed. But can anyone verify that this is an accurate analogy?
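Not exactly what you describe, but there is a related trick worth knowing: many physical layers sidestep the “agree on an exact clock” problem by using a self-clocking line code, where the timing is embedded in the signal itself. Classic 10 Mbps Ethernet uses Manchester encoding: every bit is sent as a transition, so the receiver can recover the sender’s clock from the transitions. A rough sketch (my own toy encoding/decoding, not production code):

```python
# Manchester encoding (IEEE 802.3 convention): each data bit becomes two
# half-bit voltage levels with a guaranteed mid-bit transition, so the
# receiver can recover the sender's clock from the signal itself.
def manchester_encode(bits):
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]  # 1 -> low-then-high, 0 -> high-then-low
    return out

def manchester_decode(halves):
    bits = []
    for i in range(0, len(halves), 2):
        bits.append(1 if (halves[i], halves[i + 1]) == (0, 1) else 0)
    return bits

data = [1, 0, 1, 1, 0]
print(manchester_decode(manchester_encode(data)) == data)  # -> True
```

The cost is that it uses twice the signal bandwidth per bit, which is why faster standards moved to more efficient codes, but the idea of recovering timing from the data stream itself carried forward.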
At the base level, every “standard” communicates differently, with different protocols, and built into each protocol is the timing for that protocol. Each hop your packet takes goes from one device to another using a specific protocol between them. In wired communication, a Cat 6 cable contains four pairs of copper wire and uses all the pairs for signaling, so I don’t think send and receive are on the same wire. In wireless, there are the different 802.11 standards. I looked up 802.11n, and here’s a brief description: with 802.11n, up to four data streams can be sent simultaneously using 20 MHz or 40 MHz channels, providing a theoretical maximum data rate of 600 Mbps. See MIMO.
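For anyone wondering where that 600 Mbps figure comes from, here’s the back-of-the-envelope arithmetic (the per-stream number is from the 802.11n MCS tables, assuming a 40 MHz channel and short guard interval):

```python
# 802.11n theoretical max: 4 MIMO spatial streams, each peaking at
# 150 Mbps (40 MHz channel, 64-QAM 5/6, short guard interval).
streams = 4
per_stream_mbps = 150
print(streams * per_stream_mbps)  # -> 600 Mbps
```

Real-world throughput is of course much lower, since the rate adapts downward with distance, interference, and fewer antennas on either end.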
Not sure that helps!