Block Code

In Block Code, the message sequence is subdivided into sequential blocks, each \(k\) bits long (message bits), and each \(k\)-bit block is mapped into an \(n\)-bit block (codeword bits), where \(n > k\). The encoder therefore adds \(n - k\) redundant bits to each transmitted block of \(k\) message bits. Thus the code rate# can be calculated as \(r = k/n\).
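
As a quick worked example (the \((7, 4)\) Hamming code values below are assumed purely for illustration), the redundancy and code rate follow directly from \(n\) and \(k\):

```python
# Redundancy and code rate of an (n, k) block code.
# The (7, 4) Hamming code values here are an assumed example.
n, k = 7, 4            # codeword bits, message bits
redundant = n - k      # redundant (parity) bits per block -> 3
rate = k / n           # code rate r = k/n -> 0.571...
print(f"redundant bits: {redundant}, code rate: {rate:.3f}")
```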

Links to this page
  • TIT3131 Chapter 2: Channel Capacity and Coding
  • Systematic Code

    Systematic Code is a #Block Code in which the message bits are transmitted in unaltered form. The use of a Systematic Code simplifies the implementation of the #decoder.
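
    A minimal sketch of why this simplifies the decoder, assuming a systematic \((7, 4)\) code whose first \(k\) bits carry the message:

    ```python
    # In a systematic codeword the first k bits are the message itself,
    # so message recovery (after error correction) is a simple slice.
    k = 4
    codeword = [1, 0, 1, 1, 0, 1, 0]   # assumed (7, 4) systematic codeword
    message = codeword[:k]             # message bits, unaltered
    print(message)                     # [1, 0, 1, 1]
    ```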

  • Minimum Distance

    The Minimum Distance of a #Block Code, denoted \(d_{\text{min}}\), is the smallest Hamming Distance# between any pair of distinct codewords in the code.
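
    A brute-force sketch (the small codebook is hypothetical) that finds \(d_{\text{min}}\) by checking the Hamming Distance# over every pair of distinct codewords:

    ```python
    from itertools import combinations

    def hamming(a, b):
        """Number of bit positions in which a and b differ."""
        return sum(x != y for x, y in zip(a, b))

    # Hypothetical codebook; any block code can be checked the same way.
    code = ["000000", "011011", "101101", "110110"]
    d_min = min(hamming(a, b) for a, b in combinations(code, 2))
    print(d_min)   # 4 for this codebook
    ```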

  • Linear Block Coding

    Linear Block Coding is an #Error Control Coding technique using an \((n, k)\) linear Block Code#, where \(k\) is the number of message bits and \(n\) is the number of codeword bits. The first portion of \(k\) bits is always identical to the message sequence to be transmitted, and the total number of codewords is \(2^k\). The second portion consists of \(n - k\) generalised parity-check bits (or parity bits), which are computed from the message bits using an encoding rule shown later. The basic property of Linear Block Coding is closure: the modulo-2 sum of any two codewords is another codeword.
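
    A sketch of such an encoding rule, assuming the systematic generator matrix \(G = [I_k \mid P]\) of a \((7, 4)\) Hamming code (this particular \(P\) is one common choice, used here for illustration); it also verifies the closure property:

    ```python
    from itertools import combinations, product

    k = 4
    # Systematic generator matrix G = [I_k | P]; this P is one common
    # choice for the (7, 4) Hamming code, assumed here for illustration.
    P = [[1, 1, 0],
         [0, 1, 1],
         [1, 1, 1],
         [1, 0, 1]]
    G = [[int(i == j) for j in range(k)] + P[i] for i in range(k)]

    def encode(msg):
        """Codeword = msg . G over GF(2); the first k bits equal msg."""
        return tuple(sum(m * g for m, g in zip(msg, col)) % 2
                     for col in zip(*G))

    codewords = {encode(m) for m in product([0, 1], repeat=k)}
    assert len(codewords) == 2 ** k   # 2^k codewords in total

    # Closure: the modulo-2 sum (XOR) of any two codewords is a codeword.
    for a, b in combinations(codewords, 2):
        assert tuple(x ^ y for x, y in zip(a, b)) in codewords
    print("closure holds for all", len(codewords), "codewords")
    ```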

  • Information Source

    When an information source (\(S\)) is extended to its \(n\)-th order extension, written \(S^n\), it produces Block Code# of size \(n\). The total number of blocks produced is \(K^n\), where \(K\) is the number of individual symbols. The probability of each block is found by multiplying the probabilities of its individual symbols. See Entropy# for details on how to calculate its uncertainty.
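
    A sketch for a second-order extension, assuming a memoryless binary source with symbol probabilities \(\{0.7, 0.3\}\) (chosen for illustration); it also confirms that \(H(S^n) = n\,H(S)\) for a memoryless source:

    ```python
    from itertools import product
    from math import log2, prod

    probs = {"a": 0.7, "b": 0.3}   # assumed memoryless source, K = 2
    n = 2                          # order of the extension S^n

    # S^n emits K^n blocks; each block's probability is the product of
    # the probabilities of its individual symbols.
    blocks = {"".join(syms): prod(probs[s] for s in syms)
              for syms in product(probs, repeat=n)}
    assert len(blocks) == len(probs) ** n   # K^n blocks

    def entropy(dist):
        """Shannon entropy in bits of a probability distribution."""
        return -sum(p * log2(p) for p in dist)

    # For a memoryless source, H(S^n) = n * H(S).
    print(entropy(blocks.values()), n * entropy(probs.values()))
    ```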

  • Information Rate
    \(k\) and \(n\) are defined in #Block Code where \(n > k\)
  • Error Control Coding
#math