Array (Sep 2020)
Process-monitoring-for-quality — A machine learning-based modeling for rare event detection
Abstract
Process Monitoring for Quality is a Big Data-driven quality philosophy aimed at defect detection through binary classification and empirical knowledge discovery. It is founded on Big Models, a predictive modeling paradigm that applies Machine Learning, statistics, and optimization techniques to process data to create a manufacturing functional model. Functional refers to a parsimonious model with high detection ability that can be trusted by engineers and deployed to control production. A parsimonious modeling scheme aimed at rare quality event detection is presented; parsimony is induced through feature selection and model selection. Its ability to deal with highly/ultra-unbalanced data structures and diverse learning algorithms is validated with four case studies, using the Support Vector Machine, Logistic Regression, Naive Bayes, and k-Nearest Neighbors learning algorithms. According to experimental results, the proposed learning scheme significantly outperformed typical learning approaches based on the l1-regularized logistic regression and Random Undersampling Boosting learning algorithms with respect to parsimony and prediction.
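To make the kind of scheme described above concrete, the following is a minimal illustrative sketch, not the paper's Big Models implementation: it pairs feature selection with model selection for rare-event binary classification on a synthetic, highly unbalanced dataset, and compares the result against an l1-regularized logistic regression baseline. The dataset, the SelectKBest/GridSearchCV choices, and all hyperparameters are assumptions made for illustration only.

```python
# Illustrative sketch (assumed setup, not the authors' implementation):
# parsimony via feature selection + model selection for rare-event detection,
# with an l1-regularized logistic regression baseline for comparison.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic highly unbalanced data: roughly 1% positive (defect) class.
X, y = make_classification(
    n_samples=20_000, n_features=50, n_informative=8, n_redundant=10,
    weights=[0.99, 0.01], random_state=0,
)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0,
)

# Parsimonious pipeline: keep only k features, then fit a class-weighted SVM.
pipe = Pipeline([
    ("select", SelectKBest(score_func=f_classif)),
    ("clf", SVC(kernel="rbf", class_weight="balanced")),
])
grid = GridSearchCV(
    pipe,
    param_grid={"select__k": [5, 10, 20], "clf__C": [0.1, 1, 10]},
    scoring="balanced_accuracy",  # robust to the class imbalance
    cv=5,
    n_jobs=-1,
)
grid.fit(X_tr, y_tr)
print("Best parsimonious model:", grid.best_params_)

# l1-regularized logistic regression baseline.
baseline = LogisticRegression(
    penalty="l1", solver="liblinear", class_weight="balanced", C=1.0,
).fit(X_tr, y_tr)

for name, model in [("selected SVM", grid), ("l1 logistic", baseline)]:
    print(name)
    print(classification_report(y_te, model.predict(X_te), digits=3))
```

In this sketch, parsimony comes from restricting the number of retained features (SelectKBest) and from selecting the simplest competitive configuration via cross-validation, while the class imbalance is handled through class weighting and an imbalance-aware scoring metric.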