Applied Sciences (Apr 2024)
Efficient Optimization of a Support Vector Regression Model with Natural Logarithm of the Hyperbolic Cosine Loss Function for Broader Noise Distribution
Abstract
While traditional support vector regression (SVR) models rely on loss functions tailored to specific noise distributions, this research explores an alternative approach: ε-ln SVR, which uses a loss function based on the natural logarithm of the hyperbolic cosine function (lncosh). This loss is optimal for a broader family of noise distributions known as power-raised hyperbolic secant (PHS) distributions. We derive the dual formulation of the ε-ln SVR model, which yields a nonsmooth, nonlinear convex optimization problem. To solve this problem efficiently, we propose a novel sequential minimal optimization (SMO)-like algorithm with an innovative working set selection (WSS) procedure. This procedure exploits second-order (SO)-like information by minimizing an upper bound on the second-order Taylor approximation of the difference between consecutive objective (loss) function values. Experimental results on benchmark datasets demonstrate the effectiveness of both the ε-ln SVR model with its lncosh loss and the proposed SMO-like algorithm with its computationally efficient WSS procedure. This study provides a promising tool for regression under noise distributions that extend beyond the commonly assumed Gaussian to the broader PHS family.
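The abstract does not write out the ε-insensitive lncosh loss itself; as a minimal sketch, assuming the commonly used form with a shape parameter λ > 0 (a hypothetical name, not defined in the abstract) acting on the residual u = y − f(x), it can be written as

\[
% Assumed form of the ε-insensitive lncosh loss; λ is a hypothetical
% shape parameter introduced here for illustration only.
\ell_{\varepsilon,\lambda}(u) =
\begin{cases}
0, & |u| \le \varepsilon, \\[4pt]
\dfrac{1}{\lambda}\,\ln\!\bigl(\cosh\bigl(\lambda\,(|u| - \varepsilon)\bigr)\bigr), & |u| > \varepsilon.
\end{cases}
\]

Under this assumed form, since ln cosh(x) ≈ x²/2 for small x and ln cosh(x) ≈ |x| − ln 2 for large x, the loss behaves quadratically near the ε-tube and linearly far from it, which is consistent with its stated optimality across a PHS family that ranges from Gaussian-like to heavier-tailed noise shapes.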
Keywords