Autonomous Mobile Robot Implemented in LEGO EV3 Integrated with Raspberry Pi to Use Android-Based Vision Control Algorithms for Human-Machine Interaction
Hernando León Araujo,
Jesús Gulfo Agudelo,
Richard Crawford Vidal,
Jorge Ardila Uribe,
John Freddy Remolina,
Claudia Serpa-Imbett,
Ana Milena López,
Diego Patiño Guevara
Affiliations
Hernando León Araujo, Jesús Gulfo Agudelo, Richard Crawford Vidal, Jorge Ardila Uribe, John Freddy Remolina, Claudia Serpa-Imbett, Ana Milena López: ITEM/Grupo de Investigación en Tecnologías Emergentes, School of Engineering and Architecture, Universidad Pontificia Bolivariana Seccional Montería, Carrera 6 No. 97A-99, Montería 230001, Colombia
Diego Patiño Guevara: Electronics Department, Pontificia Universidad Javeriana, Carrera 7 No. 40-62 Edificio 42, Bogotá 110111, Colombia
Robotic applications in educational programs are well known. Nonetheless, deploying robots in other settings, e.g., mine detection, agricultural support, and Industry 4.0 tasks, remains challenging. The main challenge is to support robotic operation with autonomous decision-making based on sensor-based feature extraction. To tackle this challenge, a prototype robot was assembled from mechanical parts of a LEGO MINDSTORMS EV3 robotic kit and a Raspberry Pi, controlled through visual servoing algorithms using 2D and 2½D vision approaches. The design is supported by simulations of image-based, position-based, and hybrid visual servo controllers. In the practical implementation, navigation is guided by an image-based visual servo control algorithm embedded in the Raspberry Pi, which uses a control criterion based on the evolution of the error between the target image and the sensed image. Images are collected by a camera mounted on the mobile robotic platform, which can be operated manually or automatically and is controlled by the Raspberry Pi. An Android application that displays the images on a smartphone via video streaming is presented, together with a video of the implemented robot in operation. This kind of robot could perform reactive field tasks in the settings mentioned above, since the detection and control approaches allow self-contained guidance.
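To make the error-based control criterion concrete, the sketch below illustrates a generic image-based visual servoing (IBVS) step in Python: the stacked error between the sensed image features and the target features is mapped through the pseudo-inverse of the interaction matrix to a camera velocity command. This is a minimal illustration of the standard IBVS formulation, not the authors' implementation; the feature points, depth estimate, and gain are assumed values chosen only for the example.

```python
# Minimal sketch of an IBVS control step (assumed, illustrative values only):
# drive the error e = s - s_star toward zero with v = -lambda * pinv(L) * e.
import numpy as np

LAMBDA = 0.5  # proportional control gain (assumed)

def interaction_matrix(points, Z=1.0):
    """Stack the classical 2x6 IBVS interaction matrix rows for each
    image point (x, y) in normalized coordinates, assuming a constant
    depth estimate Z for all points."""
    rows = []
    for x, y in points:
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.array(rows)

def ibvs_step(s, s_star, Z=1.0):
    """One control step: return the 6-DOF camera velocity twist
    [vx, vy, vz, wx, wy, wz] and the current feature error norm."""
    error = (s - s_star).reshape(-1)          # stacked image feature error
    L = interaction_matrix(s, Z)              # interaction (image Jacobian) matrix
    v = -LAMBDA * np.linalg.pinv(L) @ error   # proportional IBVS control law
    return v, float(np.linalg.norm(error))

# Illustrative usage with four tracked image points (normalized coordinates).
s_star = np.array([[-0.1, -0.1], [0.1, -0.1], [0.1, 0.1], [-0.1, 0.1]])  # target
s      = np.array([[-0.15, -0.08], [0.12, -0.12], [0.09, 0.14], [-0.12, 0.11]])  # sensed
v, err = ibvs_step(s, s_star)
print("velocity command:", np.round(v, 3), "error norm:", round(err, 3))
```

In a closed-loop setting, a step like this would run repeatedly: the velocity command is sent to the platform, new features are extracted from the next camera frame, and the loop stops once the error norm falls below a chosen threshold.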