Designing Encodings

Transmission over a Noiseless Channel

You may have noticed a similarity between the expressions for channel capacity and for entropy. In fact, the two correspond exactly when all possible messages are equally likely: the entropy of an information source that produces its messages with equal probability equals the capacity of the channel that carries them.
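
To see the correspondence numerically, here is a minimal sketch (the alphabet size N = 8 is assumed purely for illustration): a noiseless channel that can carry any one of N distinct symbols per use has capacity log2 N bits, and a source that emits those symbols with equal probability has exactly the same entropy.

    from math import log2

    # Hypothetical example: a channel alphabet of N = 8 equally likely symbols.
    N = 8

    # Channel capacity: log2 of the number of distinct symbols per channel use.
    capacity = log2(N)

    # Entropy of a source that emits each of the N symbols with probability 1/N.
    probs = [1 / N] * N
    entropy = -sum(p * log2(p) for p in probs)

    print(f"capacity = {capacity} bits/symbol")   # 3.0
    print(f"entropy  = {entropy} bits/symbol")    # 3.0 -- equal, as the text notes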

This relationship suggests how to transmit an information source across a channel: first encode the source into the channel alphabet so that all long sequences of channel symbols are equally likely, then transmit the result. In fact, this idea works remarkably well and forms the basis of Shannon's famous channel coding theorem.
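
One small way to see this effect (a sketch under assumed, conveniently chosen symbol probabilities, not the block-coding construction from Shannon's proof) is to encode a skewed four-symbol source with a prefix code that gives likelier symbols shorter codewords; in the encoded bit stream, 0s and 1s come out roughly equally likely.

    import random

    # Hypothetical source: four symbols with dyadic probabilities (assumed for this sketch).
    probabilities = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

    # An optimal prefix code for this source: likelier symbols get shorter codewords.
    code = {"A": "0", "B": "10", "C": "110", "D": "111"}

    random.seed(0)
    symbols = random.choices(list(probabilities),
                             weights=list(probabilities.values()), k=100_000)
    encoded = "".join(code[s] for s in symbols)

    # In the encoded stream, 0s and 1s should each appear about half the time,
    # so a long encoded message looks like a string of equally likely channel symbols.
    zeros = encoded.count("0") / len(encoded)
    print(f"fraction of 0s in encoded stream: {zeros:.3f}")   # close to 0.500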

This theorem states that if an information source produces information at a rate of H bits per second, its output can be transmitted over a channel of capacity C bits per second with arbitrarily low probability of error, as long as H ≤ C.
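
As a back-of-the-envelope illustration (the numbers are assumed, not taken from the slides): a source that emits 1,000 symbols per second with an entropy of 0.8 bits per symbol produces H = 800 bits of information per second, so it can be carried by a binary channel that transmits 1,000 binary digits per second (C = 1,000 bits per second), since 800 ≤ 1,000.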

In a sense, the theorem suggests a perfectly efficient matching of the source to the channel through proper encoding of the source's signal. Consider the telegraph example: the 26 English letters are encoded as sequences of dots and dashes, the most frequently used letter, E, is given the shortest encoding (a single dot), and some common words are even given their own special encodings. Shannon's proof takes this idea to its logical conclusion, assigning high-probability messages to short signals so that all long encoded messages are equally likely.
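
To make this concrete, the sketch below uses Huffman coding (a later construction not mentioned in the text, but the standard way to assign short codewords to probable symbols) over approximate English letter frequencies; the frequency figures are assumed for illustration and only their ordering matters. The most common letter, E, comes out with the shortest codeword, just as it receives the single dot in Morse code.

    import heapq

    # Approximate relative frequencies (percent) of the English letters; the exact
    # figures are assumed for this sketch.
    freq = {"E": 12.7, "T": 9.1, "A": 8.2, "O": 7.5, "I": 7.0, "N": 6.7,
            "S": 6.3, "H": 6.1, "R": 6.0, "D": 4.3, "L": 4.0, "U": 2.8,
            "C": 2.8, "M": 2.4, "W": 2.4, "F": 2.2, "G": 2.0, "Y": 2.0,
            "P": 1.9, "B": 1.5, "V": 1.0, "K": 0.8, "J": 0.15, "X": 0.15,
            "Q": 0.10, "Z": 0.07}

    def huffman_code(freq):
        """Build a binary Huffman code: frequent symbols get short codewords."""
        # Each heap entry is (total weight, tie-breaker, [(symbol, codeword so far), ...]).
        heap = [(w, i, [(sym, "")]) for i, (sym, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            w1, _, group1 = heapq.heappop(heap)
            w2, _, group2 = heapq.heappop(heap)
            # Merging two subtrees prepends one more bit to every codeword beneath them.
            merged = ([(s, "0" + c) for s, c in group1] +
                      [(s, "1" + c) for s, c in group2])
            heapq.heappush(heap, (w1 + w2, counter, merged))
            counter += 1
        return dict(heap[0][2])

    code = huffman_code(freq)
    for sym in sorted(code, key=lambda s: (len(code[s]), s))[:5]:
        print(sym, code[sym])   # E gets the shortest codeword, echoing Morse code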
