Entropy (Aug 2024)

Optimizing Distributions for Associated Entropic Vectors via Generative Convolutional Neural Networks

  • Shuhao Zhang,
  • Nan Liu,
  • Wei Kang,
  • Haim Permuter

DOI
https://doi.org/10.3390/e26080711
Journal volume & issue
Vol. 26, no. 8
p. 711

Abstract


The complete characterization of the almost-entropic region yields rate regions for network coding problems; however, this characterization remains a difficult open problem. In this paper, we propose a novel algorithm to determine whether an arbitrary vector in the entropy space is entropic, by parameterizing and generating probability mass functions with neural networks. Given a target vector, the algorithm trains the neural network to minimize the normalized distance between the target vector and the generated entropic vector, thereby revealing whether the target vector is entropic and recovering the underlying distribution. The algorithm was further implemented with convolutional neural networks, which naturally fit the structure of joint probability mass functions and allow the optimization to be accelerated on GPUs. Empirical results demonstrate improved normalized distances and convergence performance compared with prior works. We also optimized the Ingleton score and the Ingleton violation index, obtaining a new lower bound on the Ingleton violation index. An inner bound of the almost-entropic region with four random variables was constructed with the proposed method, giving the current best inner bound as measured by the volume ratio. The potential of this computer-aided approach for constructing achievable schemes for network coding problems is discussed.
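
The optimization the abstract describes, training a generative network to output a joint probability mass function whose entropic vector approaches a given target, can be sketched as follows. This is a minimal illustration only: it assumes four binary random variables, uses a small fully connected generator in place of the paper's convolutional architecture, and targets a hypothetical vector; the architecture, target, and hyperparameters are assumptions, not the authors' implementation.

```python
# Sketch: minimize the normalized distance between a target vector in the
# entropy space and the entropic vector of a pmf produced by a neural network.
import itertools
import torch
import torch.nn as nn

n = 4                      # number of random variables, each assumed binary
num_atoms = 2 ** n         # entries of the joint pmf
subsets = [s for r in range(1, n + 1)
           for s in itertools.combinations(range(n), r)]  # 2^n - 1 subsets

class PMFGenerator(nn.Module):
    """Maps a fixed seed to a joint pmf (the paper uses convolutional nets)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(8, hidden), nn.ReLU(),
            nn.Linear(hidden, num_atoms),
        )
        self.register_buffer("seed", torch.randn(8))

    def forward(self):
        logits = self.net(self.seed)
        return torch.softmax(logits, dim=0).reshape((2,) * n)

def entropic_vector(pmf):
    """Joint entropies H(X_S), in bits, for every non-empty subset S."""
    h = []
    for s in subsets:
        dims = tuple(i for i in range(n) if i not in s)
        marg = pmf.sum(dim=dims) if dims else pmf   # marginalize out X_{S^c}
        marg = marg.reshape(-1)
        h.append(-(marg * torch.log2(marg.clamp_min(1e-12))).sum())
    return torch.stack(h)

# Hypothetical target vector in the 15-dimensional entropy space.
target = torch.ones(len(subsets))

gen = PMFGenerator()
opt = torch.optim.Adam(gen.parameters(), lr=1e-2)
for step in range(2000):
    opt.zero_grad()
    h = entropic_vector(gen())
    loss = torch.norm(h - target) / torch.norm(target)  # normalized distance
    loss.backward()
    opt.step()

print(f"final normalized distance: {loss.item():.4f}")
```

A small final distance suggests the target is (close to) entropic and the trained network exposes a witnessing distribution; a distance bounded away from zero suggests the target lies outside the almost-entropic region, which is the decision principle described in the abstract.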

Keywords