Department of Signal Theory and Communications and Telematic Systems and Computation, Universidad Rey Juan Carlos, Campus Fuenlabrada, Camino del Molino, 5, Fuenlabrada, Madrid 28942, Spain; Corresponding author.
Juan Diego Peña Narváez
Department of Signal Theory and Communications and Telematic Systems and Computation, Universidad Rey Juan Carlos, Campus Fuenlabrada, Camino del Molino, 5, Fuenlabrada, Madrid 28942, Spain
José Miguel Guerrero Hernández
Department of Signal Theory and Communications and Telematic Systems and Computation, Universidad Rey Juan Carlos, Campus Fuenlabrada, Camino del Molino, 5, Fuenlabrada, Madrid 28942, Spain
Rodrigo Pérez Rodríguez
Department of Signal Theory and Communications and Telematic Systems and Computation, Universidad Rey Juan Carlos, Campus Fuenlabrada, Camino del Molino, 5, Fuenlabrada, Madrid 28942, Spain
Alejandro González Cantón
Department of Mechanical, Computer Science and Aerospace Engineering, Universidad de León, Campus Vegazana, s/n, León 24007, Spain
Francisco Javier Rodríguez Lera
Department of Mechanical, Computer Science and Aerospace Engineering, Universidad de León, Campus Vegazana, s/n, León 24007, Spain
This dataset was collected during the 2023 and 2024 RoboCup competitions using a TIAGo robot equipped with an RGB-D camera, a Hokuyo laser rangefinder, and a RODE microphone. The dataset includes ROSbag files capturing the robot's sensor data and task-planning behavior, as well as video recordings providing third-person views of task execution. Together, these data offer insight into the performance of autonomous robots carrying out social tasks and navigating dynamic, human-populated environments.