IEEE Access (Jan 2019)

Simultaneous Tensor Completion and Denoising by Noise Inequality Constrained Convex Optimization

  • Tatsuya Yokota,
  • Hidekata Hontani

DOI
https://doi.org/10.1109/ACCESS.2019.2894622
Journal volume & issue
Vol. 7
pp. 15669 – 15682

Abstract


Convex optimization, rather than non-convex approaches, still plays an important role in many computer science applications because of its exactness and efficiency. In this paper, we consider a noisy tensor completion problem based on convex optimization. When the entries are assumed to be noisy, the optimization problem is usually posed as a “regularization” problem, which simultaneously minimizes penalty and error terms with some tradeoff parameter. However, a good value of the tradeoff parameter is not easily determined because of the difference in units between the two terms and its dependency on the data. From the perspective of tradeoff tuning, the noisy tensor completion problem with a “noise inequality constraint” is preferable to the “regularization” formulation, because a good noise threshold can be easily bounded using the noise standard deviation. In this paper, we attempt to solve convex tensor completion problems by using two types of noise inequality constraints, corresponding to Gaussian and Laplace noise distributions. To solve the inequality-constrained convex optimization directly, we derive proximal mappings for the noise inequalities that are analytically computable with low computational complexity. The optimization algorithm is developed based on the primal-dual splitting framework, and a new step-size adaptation method is proposed to accelerate the optimization. Extensive experiments are conducted to demonstrate the advantages of the proposed method for the retrieval of visual data, such as color images, movies, and 3D volumetric data.
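As a concrete illustration of the idea (a minimal sketch, not the paper's actual implementation), the proximal mapping of a noise inequality constraint reduces to a metric projection onto a norm ball centered at the observed entries: an ℓ2-ball for Gaussian noise and an ℓ1-ball for Laplace noise. Both projections are analytically computable; the ℓ1 case uses the standard sort-based thresholding scheme. The symbols `b` (observed entries) and `eps` (noise threshold, e.g. proportional to the noise standard deviation) are illustrative names, not from the paper.

```python
import numpy as np

def proj_l2_ball(x, b, eps):
    """Project x onto {z : ||z - b||_2 <= eps} (Gaussian noise constraint)."""
    d = x - b
    n = np.linalg.norm(d)
    if n <= eps:
        return x.copy()          # already feasible: projection is the identity
    return b + (eps / n) * d     # scale the residual back onto the ball surface

def proj_l1_ball(x, b, eps):
    """Project x onto {z : ||z - b||_1 <= eps} (Laplace noise constraint),
    via the classical sort-based soft-thresholding construction."""
    d = x - b
    if np.abs(d).sum() <= eps:
        return x.copy()
    u = np.sort(np.abs(d))[::-1]                 # magnitudes, descending
    css = np.cumsum(u)
    # largest index k with u_k > (cumsum_k - eps) / k  (1-based k)
    k = np.nonzero(u * np.arange(1, u.size + 1) > css - eps)[0][-1]
    theta = (css[k] - eps) / (k + 1)             # soft-threshold level
    return b + np.sign(d) * np.maximum(np.abs(d) - theta, 0.0)
```

In a primal-dual splitting iteration, these projections would replace the gradient step on a quadratic or absolute-error data term, which is what removes the need to tune a regularization tradeoff parameter.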

Keywords