Atti della Accademia Peloritana dei Pericolanti : Classe di Scienze Fisiche, Matematiche e Naturali (Jan 2024)

Karush-Kuhn-Tucker conditions and Lagrangian approach for improving machine learning techniques: A survey and new developments

  • Tiziana Ciano,
  • Massimiliano Ferrara

DOI
https://doi.org/10.1478/AAPP.1021A1
Journal volume & issue
Vol. 102, no. 1
p. A1

Abstract


In this work we propose new proofs of some classical milestone results of nonlinear programming, in particular the Kuhn-Tucker conditions and Lagrangian methods and functions. The study highlights some interesting features of these well-known tools and methods, connecting them with a technical analysis of the “Maximal Margin Classifier”, which is designed for linearly separable data, i.e., data that can be separated by a hyperplane. In this context we point out the central role played by these mathematical tools in obtaining robustness in Machine Learning procedures, analyzing some support vector machine (SVM) models as they are used in various contexts and applications (e.g., the Soft Margin SVM and the Maximum Margin SVM). This paper represents the first study in an ongoing Machine Learning modeling effort and in a research project we will launch in the near future within this framework of analysis. We also examine the problem of estimating the bias in a decision-making process, and a new decision function algorithm is introduced.
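
By way of background (and not as a reproduction of the paper's own derivation), the following LaTeX sketch collects the textbook formulation of the maximal margin (hard-margin) classifier, its Lagrangian, the Karush-Kuhn-Tucker conditions, and the resulting decision function; the symbols w, b, and alpha_i are illustrative notation assumed here, not necessarily that of the article.

% Minimal sketch: standard hard-margin SVM primal, Lagrangian, and KKT conditions.
% Notation (w, b, alpha_i) is assumed for illustration, not taken from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Given linearly separable data $(x_i, y_i)$, $i = 1, \dots, n$, with labels
$y_i \in \{-1, +1\}$, the maximal margin classifier solves
\begin{equation}
  \min_{w,\, b} \ \tfrac{1}{2}\lVert w \rVert^2
  \quad \text{s.t.} \quad y_i\,(w^\top x_i + b) \ge 1, \quad i = 1, \dots, n.
\end{equation}
The associated Lagrangian, with multipliers $\alpha_i \ge 0$, is
\begin{equation}
  L(w, b, \alpha) = \tfrac{1}{2}\lVert w \rVert^2
  - \sum_{i=1}^{n} \alpha_i \bigl( y_i (w^\top x_i + b) - 1 \bigr),
\end{equation}
and the Karush-Kuhn-Tucker conditions read
\begin{align}
  \nabla_w L &= 0 \;\Longrightarrow\; w = \sum_{i=1}^{n} \alpha_i y_i x_i, \\
  \frac{\partial L}{\partial b} &= 0 \;\Longrightarrow\; \sum_{i=1}^{n} \alpha_i y_i = 0, \\
  \alpha_i &\ge 0, \qquad y_i (w^\top x_i + b) - 1 \ge 0, \\
  \alpha_i \bigl( y_i (w^\top x_i + b) - 1 \bigr) &= 0, \qquad i = 1, \dots, n.
\end{align}
Complementary slackness forces $\alpha_i > 0$ only on the support vectors,
which yields the decision function
\begin{equation}
  f(x) = \operatorname{sign}\Bigl( \sum_{i=1}^{n} \alpha_i y_i\, x_i^\top x + b \Bigr).
\end{equation}

\end{document}

In the soft margin variant discussed in the article, slack variables relax the inequality constraints and a penalty parameter bounds the multipliers, but the same Lagrangian and KKT machinery applies.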