Sensors (May 2018)

A Kinect-Based Segmentation of Touching-Pigs for Real-Time Monitoring

  • Miso Ju,
  • Younchang Choi,
  • Jihyun Seo,
  • Jaewon Sa,
  • Sungju Lee,
  • Yongwha Chung,
  • Daihee Park

DOI
https://doi.org/10.3390/s18061746
Journal volume & issue
Vol. 18, no. 6
p. 1746

Abstract


Segmenting touching-pigs in real time is an important issue for surveillance cameras intended for the 24-h tracking of individual pigs. However, methods to do so have not yet been reported. We focus in particular on the segmentation of touching-pigs in a crowded pig room with low-contrast images obtained using a Kinect depth sensor. We reduce the execution time by combining object detection techniques based on a convolutional neural network (CNN) with image processing techniques instead of applying time-consuming operations, such as optimization-based segmentation. We first apply the fastest CNN-based object detection technique (i.e., You Only Look Once, YOLO) to solve the separation problem for touching-pigs. If the quality of the YOLO output is not satisfactory, we then try to find a possible boundary line between the touching-pigs by analyzing their shape. Our experimental results show that this method is effective in separating touching-pigs in terms of both accuracy (i.e., 91.96%) and execution time (i.e., real-time execution), even with low-contrast images obtained using a Kinect depth sensor.
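The abstract describes a two-stage pipeline: a fast CNN detector (YOLO) first, and a shape-analysis fallback that searches for a boundary line between the two touching animals when detection alone does not separate them. The sketch below illustrates only that fallback idea in a generic form, splitting a binary blob along the line between its two deepest concavity points with OpenCV. It is not the authors' exact algorithm; the `split_touching_blob` helper, the synthetic two-ellipse mask, and the quality criterion it implies are assumptions made for demonstration.

```python
# Minimal sketch, assuming a binary foreground mask of a touching-pig blob is available
# (e.g., from a Kinect depth image after background removal). Not the paper's method:
# it splits the blob along the line joining the two deepest convexity defects.
import cv2
import numpy as np

def split_touching_blob(mask):
    """Split a binary blob along the line between its two deepest concavity points."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    cnt = max(contours, key=cv2.contourArea)                 # largest blob only
    hull = cv2.convexHull(cnt, returnPoints=False)           # hull as contour indices
    defects = cv2.convexityDefects(cnt, hull)                # Nx1x4: start, end, far, depth
    if defects is None or len(defects) < 2:
        return mask, None                                    # no clear concavities; leave unsplit
    # Two defects with the greatest depth: likely the notches where the animals meet.
    two_deepest = defects[defects[:, 0, 3].argsort()[-2:], 0]
    p1 = tuple(int(v) for v in cnt[two_deepest[0][2]][0])
    p2 = tuple(int(v) for v in cnt[two_deepest[1][2]][0])
    separated = mask.copy()
    cv2.line(separated, p1, p2, color=0, thickness=2)        # erase along the boundary line
    return separated, (p1, p2)

if __name__ == "__main__":
    # Synthetic "touching pigs": two overlapping filled ellipses forming one blob.
    mask = np.zeros((200, 300), dtype=np.uint8)
    cv2.ellipse(mask, (110, 100), (70, 35), 0, 0, 360, 255, -1)
    cv2.ellipse(mask, (200, 100), (70, 35), 20, 0, 360, 255, -1)
    separated, boundary = split_touching_blob(mask)
    n_before, _ = cv2.connectedComponents(mask)
    n_after, _ = cv2.connectedComponents(separated)
    print(f"components before: {n_before - 1}, after: {n_after - 1}, boundary: {boundary}")
```

In the paper's pipeline this kind of shape analysis is only a fallback; the YOLO detection stage (not shown here) handles most cases and keeps the overall processing within real-time limits.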

Keywords