There is an assessment question in the lesson plan that I don’t understand:

_Develop a protocol that allows the user to send a calendar date (mm/dd). What is the minimum number of bits necessary?_ The answer is listed as 9 bits.

How?

Hello Kadkins,

The highest number we need to send to communicate months is 12 (December), which in binary is 1100, so four bits are enough for the month.

The highest day we might need to send is 31, which in binary is 11111, so five bits are enough for the day.

The protocol could state that the **first four bits** indicate the **month** and the **last five bits** indicate the **day**.

We need a total of at least nine bits in order to communicate the biggest possibility (12/31).

12/31: 110011111

Other examples:

1/5: 000100101 (0001 = 1, 00101 = 5)

5/25: 010111001 (0101 = 5, 11001 = 25)
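
For anyone who wants to experiment, here is a minimal sketch of this packing in Python (the language and the helper names are my choice, not part of the lesson):

```python
def encode_date(month, day):
    """Pack a date into 9 bits: month in the top 4 bits, day in the bottom 5."""
    return (month << 5) | day

def decode_date(bits):
    """Unpack a 9-bit value back into (month, day)."""
    return bits >> 5, bits & 0b11111

print(format(encode_date(12, 31), "09b"))  # 110011111
print(format(encode_date(1, 5), "09b"))    # 000100101
print(decode_date(0b010111001))            # (5, 25)
```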

Hope this helps!

~ Hannah

I am teaching this today and figured I could discuss two main methods with my students.

One is described above: four bits for the month plus five bits for the day.

Alternatively, if you number all 365 days in a year, you can also do it in 9 bits, since 365 in binary is **101101101**.

I believe the question is trying to lead students toward protocols, and therefore toward sending the month and day in chunks, but either method can answer the minimum-number-of-bits portion of the question.
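
For illustration, here is what that day-of-year numbering looks like in Python (the helper name and the choice of a non-leap year are mine):

```python
from datetime import date

def encode_day_of_year(month, day):
    """Number every date from 1 to 365 and send that single value in 9 bits.
    2023 is used only because it is a non-leap year (365 days)."""
    n = date(2023, month, day).timetuple().tm_yday
    return format(n, "09b")

print(encode_day_of_year(12, 31))  # 101101101 (day 365)
print(encode_day_of_year(1, 1))    # 000000001 (day 1)
```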

It turns out my method (above) does not work in general.

Using the example of a 75x75 grid: the x and y coordinates would each take 7 bits, for a total of 14 bits.

However, 75 x 75 = 5625 total cells, and numbering those would only take 13 bits (2^12 = 4096 < 5625 <= 8192 = 2^13).
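
The arithmetic can be double-checked with a short sketch (bits_needed is a hypothetical helper, not something from the lesson):

```python
import math

def bits_needed(n_values):
    """Minimum number of bits that can distinguish n_values possibilities."""
    return math.ceil(math.log2(n_values))

print(bits_needed(75))       # 7 bits for one coordinate (2^7 = 128 >= 75)
print(2 * bits_needed(75))   # 14 bits sending x and y separately
print(bits_needed(75 * 75))  # 13 bits numbering all 5625 cells (2^13 = 8192)
```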

Can anyone explain this? Would 13 or 14 technically be correct?

I’m linking here for posterity’s sake the continued discussion we had surrounding @glenn.crane’s example in another thread. It’s an interesting read that I hope others can benefit from!