Advances in Protograph-Based LDPC Codes and a Rate Allocation Problem
This dissertation consists of three parts. The first part focuses on a class of modern channel codes known as protograph-based low-density parity-check (LDPC) codes. Also known as protograph LDPC codes, these powerful error-correcting codes have enabled the communication systems of the past fifteen years to achieve very high throughputs. The first part of the dissertation presents a new design method, based on an upper bound on minimum distance, for obtaining rate-compatible protograph-based Raptor-like (PBRL) quasi-cyclic (QC) LDPC codes. A major contribution here is a very-low-complexity PBRL design algorithm that is provably efficient.
The second part of the dissertation continues the focus on protograph LDPC codes, first exploring how the decoding complexity of PBRL codes can be reduced and whether the extending structure that provides rate-compatibility to a PBRL code is optimal. This part then considers the problem of designing PBRL codes for arbitrary orderings of the incremental-redundancy transmissions. The degree-1 extending structure lends itself naturally to the design of PBRL codes that decode efficiently even when increments arrive out of order. This part finally considers the following question: what is the shortest block-length at which a protograph QC-LDPC code with girth of at least 6 or 8 can be obtained from a (3, L) complete protograph? The question is answered exactly for girth of at least 6, and directions are explored for girth of at least 8.
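To make the girth question concrete, the following sketch checks the standard necessary-and-sufficient condition (due to Fossorier) for the absence of 4-cycles, i.e., girth at least 6, in a QC-LDPC code obtained by circulant lifting of a fully connected protograph: for circulant size N and shift-exponent matrix S, every 2x2 submatrix must satisfy S[i1][j1] - S[i1][j2] + S[i2][j2] - S[i2][j1] != 0 (mod N). The function name and the array-code example below are illustrative choices, not taken from the dissertation.

```python
from itertools import combinations

def has_girth_at_least_6(S, N):
    """Return True iff the QC-LDPC code lifted from the fully connected
    protograph with shift-exponent matrix S and circulant size N has no
    4-cycles (Fossorier's condition), i.e., girth >= 6."""
    rows = range(len(S))
    cols = range(len(S[0]))
    for i1, i2 in combinations(rows, 2):
        for j1, j2 in combinations(cols, 2):
            if (S[i1][j1] - S[i1][j2] + S[i2][j2] - S[i2][j1]) % N == 0:
                return False  # this 2x2 submatrix closes a 4-cycle
    return True

# Illustrative example: the classic array-code exponents S[i][j] = i*j mod N
# avoid 4-cycles when N is prime, since the alternating sum factors as
# (i1 - i2)*(j1 - j2), which is nonzero mod a prime N > max(i, j).
N = 7
S = [[(i * j) % N for j in range(5)] for i in range(3)]
print(has_girth_at_least_6(S, N))  # True

# An all-zero exponent matrix lifts to copies of the protograph itself and
# retains its 4-cycles:
print(has_girth_at_least_6([[0] * 5 for _ in range(3)], N))  # False
```

For girth at least 8, the analogous check would additionally rule out 6-cycles, which involves alternating sums over closed paths through three rows and three columns.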
Finally, the dissertation turns to communication theory and tackles a rate allocation problem previously studied in the literature, but with an important twist. Consider a cross-layer coding scheme that combines packet-level erasure coding with physical-layer channel coding. Previous work has shown that some erasure coding remains necessary even in the limit of large physical-layer codeword block-lengths if the physical-layer fading channel does not provide diversity that grows with block-length. But is erasure coding still required in that limit if the physical layer does allow diversity to grow with block-length? The theoretical answer turns out to be a resounding “no” in the case of Rayleigh fading in which diversity increases linearly with block-length.