You have two 3-bit temperature sensors (A and B) that measure the same thing. Both sensors are hooked up to the same CPU, which takes in the sensor readings. You know that the sensors are designed so that their readings can be off by at most one bit. We claim that if B knows that A has sent the CPU a 3-bit sequence, then B only needs to send 2 bits, and the CPU will be able to reconstruct B's 3-bit measurement, thereby conserving bandwidth. How is this possible?
Enjoy your weekend!
Jared - 7 years, 11 months ago
If you're claiming that the delta between sensors A and B is at most 1 (i.e., their readings are within ±1 of each other), then there are only three cases: B equals A, B is one less than A, or B is one greater than A. So we need 2 bits to represent these three states.
Bit 1 (the low-order bit): error bit. If set, the measurements are not the same. Bit 2 (the high-order bit): direction bit. If bit 1 is set, this tells us whether B is greater or less than A.
00: measurements equal
10: measurements equal (direction bit is ignored when the error bit is clear)
01: measurements not equal, B is less than A
11: measurements not equal, B is greater than A
Example 1: A sends 101; B measures 100 (one lower); B sends 01, signifying there is an error and B is lower.
Example 2: A sends 101; B measures 101; B sends 00.
Example 3: A sends 101; B measures 110 (one higher); B sends 11.
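A minimal Python sketch of this scheme, following the ±1-in-value interpretation above (the function names `encode` and `decode` are mine, not from the post). The 2-bit message is written as (direction bit, error bit), matching the left-to-right order of the codes above:

```python
def encode(a, b):
    """B's 2-bit message, given A's reading a and B's reading b.

    Assumes |a - b| <= 1. Returns (direction_bit, error_bit).
    """
    if b == a:
        return (0, 0)                  # error bit clear: readings match
    return (1 if b > a else 0, 1)      # error bit set; direction = 1 if B is greater

def decode(a, msg):
    """CPU reconstructs B's 3-bit reading from A's reading and B's message."""
    direction, error = msg
    if not error:
        return a                       # readings match
    return a + 1 if direction else a - 1

# The three worked examples from the post (values in binary):
assert encode(0b101, 0b100) == (0, 1)  # B sends 01: error, B is lower
assert encode(0b101, 0b101) == (0, 0)  # B sends 00: equal
assert encode(0b101, 0b110) == (1, 1)  # B sends 11: error, B is greater
assert decode(0b101, (0, 1)) == 0b100
```

Note that the CPU can only recover B's reading because it already has A's reading; the 2-bit message alone does not determine B's value.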