I am trying to make sure I handled bit rate correctly. Conceptually, it makes sense that the effective bit rate would be slower than the raw sending speed, because you need a timed protocol just to keep the message clear. Many groups are using a code like “AAA” to indicate that the sender is starting the message and “BBB” to indicate that the message is over, so the time to send those indicator bits gets added in, while the actual message is just what falls in between. My concern is that I didn’t mention that the math should add the time for those “begin” and “end” bits without counting them as message bits sent, so they just make the bit rate slower. I’m looking for confirmation that I can correct this tomorrow and tell the students that even though those indicators may be necessary, they slow down the bit rate, so they might choose a shorter indicator like “A” so as not to slow the message down so much.
So, first things first: I’m not sure there are 100% right answers here, only arguments we can make, because terms like latency and bit rate do carry some ambiguity in common practice. I think I follow your argument. Let me see if I can say it a different way, and you tell me if this is what you mean…
I think there are two important distinctions to make: (1) What is the total amount of time it takes for an entire message to get through to the recipient? (2) What is the rate at which bits are sent as part of this system?
“Bitrate” is typically a measurement of the system. So if the students are sending, let’s say, 1 bit per second, then the bitrate is 1 bit per second, no matter how many bits they are sending.
“Latency” is also typically a measurement of the system, not the message, but is a little confusing at this point because the students’ protocol requires this metadata - using AAA and BBB. So you could argue that there is some latency for the message itself – the first bit of the message intended for the recipient starts with the first bit sent after the AAA. If it takes 3 seconds for the AAA to get through, in other words, you could argue that the message latency is 3 seconds.
However, that argument is about the latency of the message, not the system. As a measurement of the system the latency is just 1 second - the amount of time it takes to convey the first bit (which would be the first A of the AAA header).
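To make the two readings concrete, here’s a minimal sketch of the arithmetic, assuming the 1 bit per second rate from the example above (the variable names are mine, just for illustration):

```python
BIT_TIME = 1.0   # seconds per bit (1 bit/s, as in the example above)
HEADER = "AAA"   # the students' start-of-message indicator

# System latency: the time for a single bit (the first A) to arrive.
system_latency = 1 * BIT_TIME

# Message latency (the other reading): the recipient can't see the first
# bit of the actual message until the whole AAA header has gone through.
message_latency = len(HEADER) * BIT_TIME

print(system_latency)   # 1.0
print(message_latency)  # 3.0
```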
So, I’m not sure that helps. As for your students and their protocol… I would suggest they don’t need the AAA at all. If they know when the starting time is, they can just read the first bit after that point. I would argue they need to know the starting time in order to detect three As in the first place… so they must already be doing this. If they’re already detecting the first bit after the start time, why not just make that the first bit of the message?
As for the end of message…if they can agree as part of their protocol how many bits make up an entire message, then they don’t need the BBB either. If they want a variable-length message though, then having an end-of-message indicator makes sense.
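The two end-of-message options above can be sketched in a few lines. This is just an illustration under my own assumptions (the function names are hypothetical, and I’m treating the received symbols as a simple string):

```python
END_MARKER = "BBB"   # the students' end-of-message indicator

def read_fixed(stream, length):
    """Fixed-length protocol: both sides agree on the message length
    in advance, so no end indicator is needed."""
    return stream[:length]

def read_variable(stream):
    """Variable-length protocol: read until the agreed end marker."""
    end = stream.find(END_MARKER)
    return stream[:end] if end != -1 else stream

symbols = "0110" + END_MARKER
print(read_fixed(symbols, 4))   # '0110'
print(read_variable(symbols))   # '0110'
```

Either way, the fixed-length version avoids sending the three extra symbols at all.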
Does this help?
Baud rate vs. bit rate: baud is symbols per second, while bit rate is data bits per second. If you have a start and a stop symbol, you add two extra symbols per group. This is often a point of confusion, and like so many similar terms the two are often used interchangeably. Different forms of encoding also disguise bit rates: sometimes a single baud (symbol) can represent more than one data bit. I know this is not an answer to Unit 1 Lesson 3.
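One way to see how that start/stop overhead drags down the effective rate is a quick calculation. This is a sketch under my own assumptions (the function and its parameters are hypothetical, not from the lesson worksheet):

```python
def effective_bit_rate(payload_bits, overhead_symbols, symbol_rate, bits_per_symbol=1):
    """Payload bits divided by total transmission time, where the time
    also covers the start/stop (overhead) symbols."""
    total_symbols = payload_bits / bits_per_symbol + overhead_symbols
    total_time = total_symbols / symbol_rate
    return payload_bits / total_time

# An 8-bit message framed by AAA + BBB (6 overhead symbols),
# sent at 1 symbol/s with 1 bit per symbol:
print(effective_bit_rate(8, 6, 1))   # roughly 0.57 bits/s, vs. a raw 1 bit/s
```

Shortening the indicators (say, "A" and "B" instead, 2 overhead symbols) raises the effective rate toward the raw rate, which is exactly the trade-off in the original question.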
Rhetorical Question: How can we teach these vocabulary words when the real world has so many definitions?
I have a similar challenge with each of these vocabulary words: bitrate, latency, and bandwidth. Bandwidth seems out of place here (except in the larger scope presented in the supporting video).
The “discovery” element of these lessons allows for multiple solution types, some of which introduce larger-than-bit-focus complications (e.g., message overhead, which is not specifically addressed yet).
Rather than redirecting the students’ effort away from those complications, I found myself letting them take it where they wanted to go. It does present a problem for me, the teacher, in explaining the formula on the worksheet for calculating bitrate. Maybe there should be both “bitrate” and “message rate” to accommodate those students who add message overhead…?
Latency can also have multiple specific interpretations (to add to the confusion). For the purposes of these lessons, maybe latency should simply be viewed as “a delay” (rather than tying it to “bit”, “message”, or “system”). I only discussed latency in the context of physical system delays, since I didn’t see much value in a bit-delay. Message delay becomes significant once message overhead is added to the actual message bits.