IEEE Access (Jan 2019)

Tool Wear Predicting Based on Multisensory Raw Signals Fusion by Reshaped Time Series Convolutional Neural Network in Manufacturing

  • Zhiwen Huang,
  • Jianmin Zhu,
  • Jingtao Lei,
  • Xiaoru Li,
  • Fengqing Tian

DOI
https://doi.org/10.1109/ACCESS.2019.2958330
Journal volume & issue
Vol. 7
pp. 178640 – 178651

Abstract


Tool wear monitoring is a typical multi-sensor information fusion task. Handcrafted features may be a suboptimal choice that lowers monitoring accuracy and incurs significant computational cost, hindering real-time application. To solve these problems, this paper proposes a new multisensory data-driven tool wear prediction method based on a reshaped time series convolutional neural network (RTSCNN). In this method, a reshaped time series layer is introduced to represent the multisensory raw signals, alternating convolutional and pooling layers are employed to adaptively learn distinctive characteristics of tool wear directly from the multisensory raw signals, and a multi-layer perceptron with a regression layer performs automatic tool wear prediction. In addition, three tool run-to-failure datasets, measured from a three-flute ball-nose tungsten carbide cutter on a high-speed CNC machine under milling operations, are used to experimentally demonstrate the performance of the proposed RTSCNN-based multisensory data-driven tool wear prediction method. The experimental results show that the prediction error of the RTSCNN-based data-driven method is noticeably lower than that of other state-of-the-art methods.
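To illustrate the pipeline the abstract describes (reshaped time series layer, alternating convolution/pooling, MLP regression head), the following is a minimal sketch in PyTorch. It is not the authors' exact configuration: the sensor count, signal length, 64×64 reshaping, channel counts, and kernel sizes are illustrative assumptions.

```python
# Hedged sketch of an RTSCNN-style tool wear regressor.
# Assumptions: 7 raw sensor channels, 4096 samples per channel (reshaped
# to 64x64), two conv/pool stages, and a small MLP regression head.
import torch
import torch.nn as nn


class RTSCNNSketch(nn.Module):
    def __init__(self, n_sensors=7, signal_length=4096, side=64):
        super().__init__()
        assert signal_length == side * side, "each channel must reshape to side x side"
        self.side = side
        # Alternating convolutional and pooling layers learn wear-related
        # features directly from the reshaped raw signals.
        self.features = nn.Sequential(
            nn.Conv2d(n_sensors, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Multi-layer perceptron with a single-output regression layer.
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (side // 4) * (side // 4), 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        # x: (batch, n_sensors, signal_length) multisensory raw time series
        b, c, _ = x.shape
        x = x.view(b, c, self.side, self.side)  # reshaped time series layer
        return self.regressor(self.features(x)).squeeze(-1)


# Usage example: predict wear values for a batch of 8 cuts.
model = RTSCNNSketch()
wear = model(torch.randn(8, 7, 4096))
print(wear.shape)  # torch.Size([8])
```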

Keywords