RoboTrio2: Annotated Interactions of a Teleoperated Robot and Human Dyads for Data-Driven Behavioral Models
Abstract
We present RoboTrio2, an annotated multimodal corpus of interactions between an autonomous-looking social robot and two humans, together with the original method used to record it: immersive tele-operation of the robot, which makes the robot behave naturally and efficiently while capturing many signals (gaze including vergence, head and neck movements, the exact subjective stereo views that motivate the decisions taken during the interaction, binaural audio, ...). With this high level of embodiment, the pilot provides the robot with demonstrations of the conversational skills needed to conduct a natural interaction with humans and successfully perform the intended task (social interactions in a gaming scenario, with gaze and speech turn-taking). The behaviors of its two human partners are also recorded through static HD cameras and headset microphones to ease annotation. Training autonomous behavioral models for our social robot is the main goal of this 8-hour corpus, but the released corpus and annotations also allow the study of the elicited human behaviors.