Advances in Mechanical Engineering (Dec 2019)

Exploiting visual cues for safe and flexible cyber-physical production systems

  • Syed Osama Bin Islam,
  • Waqas Akbar Lughmani,
  • Waqar Shahid Qureshi,
  • Azfar Khalid,
  • Miguel Angel Mariscal,
  • Susana Garcia-Herrero

DOI
https://doi.org/10.1177/1687814019897228
Journal volume & issue
Vol. 11

Abstract


Human workers are envisioned to work alongside robots and other intelligent factory modules and to fulfill supervision tasks in future smart factories. Technological developments in smart factory automation over the last few years have introduced the concept of cyber-physical systems, which has since expanded into cyber-physical production systems. In this context, the role of collaborative robots is significant and depends largely on advanced capabilities such as collision detection, impedance control, and artificial-intelligence-based learning of new tasks. The system components, collaborative robots, and humans need to communicate for collective decision-making. This requires processing shared information in light of the available knowledge and reasoning, together with flexible systems that are resilient to real-time dynamic changes on the factory floor as well as within the communication and computing infrastructure. This article presents an ontology-based approach to resolving industrial safety scenarios in cyber-physical production systems. A case study of an industrial scenario is presented to validate the approach, in which visual cues are used to detect and react to dynamic changes in real time. Multiple scenarios are tested for simultaneous detection and prioritization to enhance the learning surface of the intelligent production system, with the goal of automating safety-based decisions.
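To make the abstract's idea of ontology-driven, cue-based safety decisions more concrete, the following is a minimal sketch, not the authors' implementation: it assumes a small hand-written knowledge base of hypothetical visual cues (e.g. "human_in_robot_workspace") mapped to safety reactions with priorities, and shows how simultaneously detected cues could be resolved by prioritization.

```python
"""Minimal sketch of ontology-style safety decision-making driven by visual
cues. Cue names, actions, and priorities are illustrative assumptions."""

from dataclasses import dataclass
from typing import List


@dataclass(frozen=True)
class SafetyRule:
    cue: str       # visual cue detected on the shop floor
    action: str    # reaction the production system should take
    priority: int  # lower number = more critical


# Hypothetical knowledge base relating visual cues to safety reactions.
ONTOLOGY: List[SafetyRule] = [
    SafetyRule("human_in_robot_workspace", "stop_robot", priority=0),
    SafetyRule("missing_safety_gloves", "pause_and_warn", priority=1),
    SafetyRule("misplaced_workpiece", "slow_down_and_flag", priority=2),
]


def decide(detected_cues: List[str]) -> List[str]:
    """Return actions for all simultaneously detected cues, ordered by
    priority so the most critical reaction comes first."""
    matched = [rule for rule in ONTOLOGY if rule.cue in detected_cues]
    return [rule.action for rule in sorted(matched, key=lambda r: r.priority)]


if __name__ == "__main__":
    # Two cues detected in the same frame: prioritization resolves them.
    print(decide(["misplaced_workpiece", "human_in_robot_workspace"]))
    # -> ['stop_robot', 'slow_down_and_flag']
```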