Intuition says we measure information by the length of a message. But Shannon's information theory starts from something more fundamental: how surprising is the message? Through illuminating examples, discover how entropy emerges as a measure of average surprise.
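
As a small taste of the idea, here is a minimal sketch (not from the original text) of entropy computed as the probability-weighted average of each outcome's surprise, $-\log_2 p$, using an illustrative coin-flip distribution and base-2 logarithms so the units are bits:

```python
import math

def surprise(p: float) -> float:
    """Surprisal (self-information) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Entropy: the probability-weighted average surprisal over all outcomes."""
    return sum(p * surprise(p) for p in probs if p > 0)

# A fair coin: both outcomes are equally surprising, so the average is 1 bit.
print(entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin: the usual outcome is barely surprising at all,
# so the average surprise (entropy) is much lower.
print(entropy([0.99, 0.01]))  # ~0.081
```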