Abstract
In the growing Industry 4.0 market, there is a strong need for automatic inspection methods that support manufacturing processes. Tool wear in turning is one of the biggest concerns, and most expert operators are able to infer it indirectly by analysing the removed chips. Automating this operation would enable more efficient cutting processes and easier process-planning management, moving toward the Zero Defect Manufacturing paradigm. This paper presents a deep learning approach, based on image processing of turning chips, for indirectly identifying tool wear levels. The procedure extracts different indicators from the RGB and HSV image channels and trains a neural network to classify the chips according to tool state conditions. Images were collected with a high-resolution digital camera during an experimental cutting campaign in which tool wear was also analysed with direct microscope imaging. The sensitivity analysis confirmed that the most sensitive image channel is the hue value H, which was used to train the network, leading to correct classification rates in the range of 95%. The feasibility of the deep learning approach for indirectly inferring tool wear from chip colour characterisation is thus confirmed. However, because variables such as the workpiece material and the cutting process parameters strongly affect chip colours, the applicability is limited to stable production flows. An industrial implementation can be foreseen by populating suitably large databases and by implementing real-time chip segmentation analysis.
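As a rough illustration of the kind of pipeline the abstract describes, the sketch below converts an RGB chip image to hue (the H channel of HSV, reported as the most sensitive indicator) and summarises it as a normalised histogram feature vector that a classifier could consume. This is a hypothetical minimal sketch under stated assumptions, not the authors' implementation: the function names, the 16-bin histogram, and the pure-NumPy hue conversion are all illustrative choices, and the actual neural-network classifier is omitted.

```python
import numpy as np

def rgb_to_hue(img):
    """Per-pixel hue in [0, 1) for an (H, W, 3) float RGB image in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    delta = mx - img.min(axis=-1)
    hue = np.zeros_like(mx)
    mask = delta > 0
    # Piecewise HSV hue formula, expressed in units of 60 degrees.
    idx = mask & (mx == r)
    hue[idx] = ((g[idx] - b[idx]) / delta[idx]) % 6
    idx = mask & (mx == g) & (mx != r)
    hue[idx] = (b[idx] - r[idx]) / delta[idx] + 2
    idx = mask & (mx == b) & (mx != r) & (mx != g)
    hue[idx] = (r[idx] - g[idx]) / delta[idx] + 4
    return hue / 6.0  # rescale 0..6 -> 0..1

def hue_histogram(img, bins=16):
    """Normalised hue histogram: one feature vector per chip image."""
    counts, _ = np.histogram(rgb_to_hue(img), bins=bins, range=(0.0, 1.0))
    return counts / counts.sum()
```

A feature vector like this could then be fed to any classifier that maps chip colour statistics to discrete tool-wear classes; for example, a strongly reddish chip concentrates its mass in the low-hue bins, while a bluish (more oxidised, hotter-cut) chip peaks in the mid-to-high bins.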
| Original language | English |
| --- | --- |
| Pages (from-to) | 1099-1114 |
| Number of pages | 16 |
| Journal | International Journal of Advanced Manufacturing Technology |
| Volume | 111 |
| Issue number | 3-4 |
| Early online date | 7 Oct 2020 |
| DOIs | |
| Publication status | Published - 1 Nov 2020 |