EURO Journal on Computational Optimization (Sep 2015)

Global resolution of the support vector machine regression parameters selection problem with LPCC

  • Yu-Ching Lee,
  • Jong-Shi Pang,
  • John E. Mitchell

Journal volume & issue
Vol. 3, no. 3
pp. 197 – 261

Abstract


Support vector machine regression is a robust data-fitting method that minimizes the sum of deducted residuals of the regression, and is thus less sensitive to changes in the data near the regression hyperplane. Two design parameters, the insensitive tube size (εe) and the weight Ce that trades off the regression error against the norm of the support vector, are selected by the user to obtain better forecasts. The global training-and-validation parameter selection procedure for support vector machine regression can be formulated as a bi-level optimization model, which is equivalently reformulated as a linear program with linear complementarity constraints (LPCC). We propose a rectangle search global optimization algorithm to solve this LPCC. The algorithm exhausts the invariancy regions on the parameter plane (the (Ce,εe)-plane) without explicitly identifying the edges of the regions. The algorithm is tested on synthetic and real-world support vector machine regression problems with up to hundreds of data points, and its efficiency is compared with several other approaches. The resulting globally optimal parameters serve as an important benchmark for any other choice of parameters.
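To make the selection problem concrete, the following is a minimal sketch of the setup the paper improves upon: a one-dimensional linear SVR fitted by subgradient descent on the primal objective 0.5·w² + Ce·Σ max(0, |y − w·x − b| − εe), with the two parameters (Ce, εe) chosen by a naive grid search over a training/validation split. This is an illustrative baseline only, not the paper's LPCC reformulation or rectangle search algorithm, and all function names, grids, and data here are hypothetical.

```python
import numpy as np

def fit_linear_svr(x, y, C, eps, lr=1e-3, iters=4000):
    """Fit y ~ w*x + b by subgradient descent on the primal SVR objective
    0.5*w**2 + C * sum(max(0, |y - w*x - b| - eps))."""
    w, b = 0.0, 0.0
    for _ in range(iters):
        r = y - (w * x + b)
        outside = np.abs(r) > eps        # points outside the eps-insensitive tube
        s = -np.sign(r) * outside        # subgradient of the hinge-type loss term
        w -= lr * (w + C * np.sum(s * x))
        b -= lr * (C * np.sum(s))
    return w, b

# Hypothetical synthetic data: y = 2x + small noise, split into training
# and validation parts (the bi-level model's "lower" and "upper" data roles).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 3.0, 40)
y = 2.0 * x + 0.1 * rng.standard_normal(40)
x_tr, y_tr, x_val, y_val = x[:30], y[:30], x[30:], y[30:]

# Naive grid search over (C, eps): fit on the training data, score on the
# validation data, keep the pair with the smallest validation MAE.  Unlike
# the rectangle search of the paper, this explores only finitely many
# candidate points and carries no global optimality guarantee.
best = (np.inf, None, None)
for C in (0.5, 2.0, 8.0):
    for eps in (0.05, 0.1, 0.4):
        w, b = fit_linear_svr(x_tr, y_tr, C, eps)
        mae = np.mean(np.abs(y_val - (w * x_val + b)))
        best = min(best, (mae, C, eps))
best_mae, best_C, best_eps = best
w_best, b_best = fit_linear_svr(x_tr, y_tr, best_C, best_eps)
```

The grid-search baseline makes the paper's contribution visible: the validation error is piecewise behaved over invariancy regions of the (Ce, εe)-plane, and the proposed rectangle search exhausts those regions to certify a global optimum rather than sampling a finite grid.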

Keywords