Information Theory provides a quantitative measure of the information contained in message signals and allows us to determine the capacity of a communication system to transmit information from source to destination. It deals with the mathematical modelling and analysis of communication systems, applying the laws of probability. It covers the Source Coding Theorem and the Channel Coding Theorem.
Information Theory
- TIT3131 Chapter 1: Information Sources and Source Coding
Entropy
Entropy is a measure of uncertainty: it gives the average information content per source symbol before the source output is observed. It is the basic measure of information, expressed in bits per symbol (or per message) and denoted by \(H(\phi)\). The formal mathematical definition is shown below:
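As a companion to the definition, entropy can also be computed numerically. A minimal Python sketch (the function name and example distributions are illustrative, assuming base-2 logarithms so the result is in bits per symbol):

```python
import math

def entropy(probs):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p)).

    Symbols with zero probability contribute nothing
    (using the convention 0 * log 0 = 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source is maximally uncertain: 1 bit per symbol.
print(entropy([0.5, 0.5]))   # 1.0

# A biased source carries less information per symbol on average.
print(entropy([0.9, 0.1]))   # about 0.469
```

Note how the biased source has lower entropy: its output is more predictable, so each observed symbol resolves less uncertainty.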