Chapter 20 Information Theory

In this chapter we introduce the theoretical concepts behind the security of a cryptosystem. The basic question is the following: If Eve observes a piece of ciphertext, does she gain any new information about the encryption key that she did not already have? To address this issue, we need a mathematical definition of information. This involves probability and the use of a very important measure called entropy.

Many of the ideas in this chapter originated with Claude Shannon in the 1940s.

Before we start, let’s consider an example. Roll a standard six-sided die. Let A be the event that the number of dots is odd, and let B be the event that the number of dots is at least 3. If someone tells you that the roll belongs to the event A ∩ B, then you know that there are only two possibilities for what the roll is, namely 3 and 5. In this sense, A ∩ B tells you more about the value of the roll than either the event A or the event B alone, so the information contained in the event A ∩ B is larger than the information in just A or just B.
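As a quick check of this intuition, here is a minimal Python sketch. It assumes the logarithmic measure of information developed later in this chapter, namely −log₂ of an event’s probability, which has not yet been defined at this point:

```python
import math

# Sample space: the six equally likely faces of a die.
faces = {1, 2, 3, 4, 5, 6}

A = {1, 3, 5}       # event A: the number of dots is odd
B = {3, 4, 5, 6}    # event B: the number of dots is at least 3
AB = A & B          # event A ∩ B = {3, 5}

def info_bits(event):
    """Information (surprisal) of an event, in bits: -log2 of its probability."""
    p = len(event) / len(faces)
    return -math.log2(p)

for name, event in [("A", A), ("B", B), ("A ∩ B", AB)]:
    print(f"{name}: P = {len(event)}/6, information = {info_bits(event):.3f} bits")

# Output:
# A: P = 3/6, information = 1.000 bits
# B: P = 4/6, information = 0.585 bits
# A ∩ B: P = 2/6, information = 1.585 bits
```

As expected, the rarer event A ∩ B carries the most information: 1.585 bits, versus 1 bit for A and about 0.585 bits for B.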

The idea of information is closely linked with the idea of uncertainty. Going back to the example of the die, if you are told that the event A ∩ B happened, you become less uncertain about what the value of the roll was than if you are simply told that event A occurred. Thus the information increased while the uncertainty decreased. Entropy provides a measure of the increase in information or the decrease in uncertainty provided by the outcome of an experiment.
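For completeness, here is how the numbers work out if we measure uncertainty as the base-2 logarithm of the number of equally likely outcomes remaining, a special case of the entropy defined later in this chapter:

\[
H(\text{roll}) = \log_2 6 \approx 2.585, \qquad
H(\text{roll} \mid A) = \log_2 3 \approx 1.585, \qquad
H(\text{roll} \mid A \cap B) = \log_2 2 = 1,
\]

all measured in bits. Learning A reduces the uncertainty by exactly 1 bit, while learning A ∩ B reduces it by about 1.585 bits, matching the information values computed in the sketch above.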
