Entropy (Apr 2020)

Finite-Length Analyses for Source and Channel Coding on Markov Chains

  • Masahito Hayashi,
  • Shun Watanabe

DOI: https://doi.org/10.3390/e22040460
Journal volume & issue: Vol. 22, no. 4, p. 460

Abstract

We derive finite-length bounds for two problems involving Markov chains: source coding with side-information in which the source and the side-information form a joint Markov chain, and channel coding for channels with Markovian conditional additive noise. To this end, we point out two important aspects of finite-length analysis that must be addressed whenever finite-length bounds are proposed: one is asymptotic tightness, and the other is the efficient computability of the bound. We then derive finite-length upper and lower bounds on the coding length in both settings such that their computational complexity is low. We address the first of these aspects by deriving large deviation bounds, moderate deviation bounds, and second-order bounds for the two problems, and we show that the proposed finite-length bounds achieve asymptotic optimality in these senses. Several kinds of information measures for transition matrices are introduced for this purpose.
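For orientation, a second-order bound of the kind mentioned in the abstract typically takes the following Gaussian-approximation form; this is a generic sketch rather than the paper's exact theorem, and the entropy rate H, the asymptotic variance V, and the O(log n) remainder are placeholders used only for illustration:

% Generic shape of a second-order expansion for the minimum coding length
% L(n, epsilon) at block length n and error probability epsilon.
% H plays the role of the conditional entropy rate of the Markov source
% given the side-information, V an asymptotic variance, and Q^{-1} the
% inverse of the Gaussian tail function; all are illustrative placeholders.
\[
  L(n,\varepsilon) = n H + \sqrt{n V}\, Q^{-1}(\varepsilon) + O(\log n),
  \qquad
  Q(t) = \int_{t}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-u^{2}/2}\, du .
\]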

Keywords