Collaboration between humans and robots is one of the most disruptive and challenging research areas. Given recent advances in design and artificial intelligence, humans and robots may soon work side by side on a wide variety of tasks. Robots could also become new playmates. In fact, an emerging trend is so-called phygital gaming, which builds on the idea of merging the physical world with a virtual one so that physical and virtual entities, such as players, robots, animated characters, and other game objects, can interact seamlessly as if they were all part of the same reality. This paper focuses on mixed reality gaming environments created using floor projection, and tackles the issue of enabling accurate and robust tracking of off-the-shelf robots endowed with limited sensing capabilities. The proposed solution fuses visual tracking data gathered by a fixed camera in a smart environment with odometry data obtained from the robot's on-board sensors. The solution was tested within a phygital gaming platform in a real usage scenario, by experimenting with a robotic game that exhibits many challenging situations that would be hard to manage with conventional tracking techniques.
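The fusion of fixed-camera tracking with on-board odometry described above is commonly realized with a Kalman-style correction step. The sketch below illustrates that general idea only; it is not the paper's actual algorithm, and the function name `fuse_pose`, the 2-D position state, and the Gaussian-noise assumption are all illustrative choices.

```python
import numpy as np

def fuse_pose(odom_pose, odom_cov, cam_pose, cam_cov):
    """One Kalman-style correction: blend an odometry-predicted pose
    with a camera measurement according to their covariances.
    Illustrative sketch, not the method from the paper."""
    # Gain approaches identity as odometry uncertainty dominates the camera's,
    # so the camera fix pulls the estimate back when dead reckoning drifts.
    K = odom_cov @ np.linalg.inv(odom_cov + cam_cov)
    fused = odom_pose + K @ (cam_pose - odom_pose)
    cov = (np.eye(len(odom_pose)) - K) @ odom_cov
    return fused, cov

# Drifted dead-reckoned position versus a more precise camera fix.
odom = np.array([1.00, 2.00]); odom_cov = 0.5 * np.eye(2)
cam  = np.array([1.20, 1.90]); cam_cov  = 0.1 * np.eye(2)
fused, cov = fuse_pose(odom, odom_cov, cam, cam_cov)
```

Because the camera here is five times more certain than odometry, the fused estimate lands close to the camera fix, and the fused covariance is smaller than either input's, which is the property that makes such fusion attractive for low-cost robots.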
Title: Robust robot tracking for next-generation collaborative robotics-based gaming environments
Publication date: In press
Digital Object Identifier (DOI): 10.1109/TETC.2017.2769705
Publication type: 1.1 Journal article