Measuring Information
How can we create and transmit messages? And how much information does a message contain? Can we measure this?
Start this lesson by watching the following video from a computer science course on Information Theory hosted on Khan Academy.
The important formula to take away from this video calculates the number of bits needed to store a message:

$$b = \log_2(s) \cdot n$$
where $b$ is the average number of bits we need for a message, $s$ is the number of symbols available, and $n$ is the number of digits, i.e. the length of the message. However, the average number of bits required doesn't help us when we need to decide how much storage to allocate. We should always make sure we can store any message, not just the average-sized one. We can solve this by rounding the first part, $\log_2(s)$, up to the next larger integer:

$$b = \lceil \log_2(s) \rceil \cdot n$$
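Both formulas are easy to check in code. The following is a minimal Python sketch; the function names `average_bits` and `storage_bits` are illustrative choices, not something defined in the video:

```python
import math

def average_bits(symbols: int, length: int) -> float:
    """Average number of bits for a message of `length` digits
    drawn from an alphabet of `symbols` symbols: log2(s) * n."""
    return math.log2(symbols) * length

def storage_bits(symbols: int, length: int) -> int:
    """Bits to allocate so that *any* such message fits:
    round log2(s) up to the next larger integer first."""
    return math.ceil(math.log2(symbols)) * length

# A message of 8 binary symbols (s = 2, n = 8) needs exactly 8 bits.
print(average_bits(2, 8))  # 8.0
print(storage_bits(2, 8))  # 8
```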
So, for example, a four-digit PIN would require, on average, $\log_2(10) \cdot 4 \approx 3.32 \cdot 4$ bits, which is about $13.3$ bits. Since we agreed to round to the next larger integer, we calculate $\lceil \log_2(10) \rceil \cdot 4 = 4 \cdot 4 = 16$ bits to be on the safe side. When would this worst case happen?
Consider the PIN 8899. A single 8 in binary is 1000, which takes 4 bits to store. Since 9 is even bigger than 8, it also requires four bits: 1001. Any PIN made up only of digits below 8 could be stored with just 3 bits per digit, but we can't plan our system around that. A bank shouldn't, that's for sure!
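To see this worst case in code, here is a short Python sketch (standard library only) that encodes each digit of the PIN 8899 in a fixed 4 bits:

```python
# Encode each PIN digit in a fixed 4 bits, because the worst-case
# digits 8 and 9 need all 4 bits (1000 and 1001 in binary).
pin = "8899"
encoded = " ".join(format(int(digit), "04b") for digit in pin)
print(encoded)  # 1000 1000 1001 1001 -> 4 digits x 4 bits = 16 bits
```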