Motion Tracking Hands-Free HMI of Electric Wheelchair / Baglieri, Lorenzo; Matsuura, Daisuke; Kobayashi, Tsune; Quaglia, Giuseppe. - ELECTRONIC. - 1:(2025), pp. 181-189. (Paper presented at the I4SDG Workshop 2025 - IFToMM for Sustainable Development Goals, held in Villa San Giovanni, 09/06/2025 - 12/06/2025) [10.1007/978-3-031-91151-4_20].
Motion Tracking Hands-Free HMI of Electric Wheelchair
Baglieri, Lorenzo; Quaglia, Giuseppe
2025
Abstract
The rise in mobility challenges due to aging populations has increased the need for innovative wheelchair technologies. Many users struggle with traditional power wheelchairs, finding them difficult or impossible to steer. The solution proposed by the authors is a novel side-by-side, hands-free Human-Machine Interface (HMI) for an omnidirectional electric wheelchair. It uses a vision-based interface, with cameras capturing the head and torso movements of both the rider and a caregiver. Their motion intentions are interpreted and combined through shared control to produce velocity commands, enabling intuitive and collaborative navigation. This paper focuses on the working principle of the hands-free HMI, analyzing how it detects and processes the rider's movements. These inputs are translated into linear and angular velocities relative to the wheelchair base, allowing navigation without the use of the hands. Simulations validate the interface's ability to drive the wheelchair smoothly around obstacles, aligning commands with user intentions. The system offers adjustable sensitivity, accommodating different movement thresholds for generating velocities.
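The abstract describes posture inputs being translated into velocity commands with adjustable sensitivity and movement thresholds, and rider/caregiver intentions being combined through shared control. A minimal sketch of one plausible realization is given below; it is not the authors' implementation, and the angle names, dead-zone thresholds, gains, and the linear blending scheme are all assumptions for illustration only.

```python
# Hypothetical sketch: mapping tracked rider posture (degrees) to wheelchair
# velocity commands (vx, vy, wz), with a dead zone and adjustable gains.
# All parameter names and values are illustrative assumptions.

def posture_to_velocity(pitch_deg, roll_deg, yaw_deg,
                        dead_zone_deg=5.0, gain_lin=0.02, gain_ang=0.03):
    """Map torso pitch/roll and head yaw to (vx, vy, wz).

    A dead zone around the neutral posture suppresses small involuntary
    motions; the gains act as adjustable sensitivity, so users with a
    limited range of motion can still reach the full velocity range.
    """
    def deadzone(angle):
        if abs(angle) < dead_zone_deg:
            return 0.0
        # Shift by the threshold so velocity ramps up smoothly from zero.
        return angle - dead_zone_deg if angle > 0 else angle + dead_zone_deg

    vx = gain_lin * deadzone(pitch_deg)  # lean forward/back -> forward speed [m/s]
    vy = gain_lin * deadzone(roll_deg)   # lean sideways -> lateral speed (omnidirectional base)
    wz = gain_ang * deadzone(yaw_deg)    # turn head -> angular speed [rad/s]
    return vx, vy, wz


def shared_control(v_rider, v_caregiver, alpha=0.5):
    """Blend rider and caregiver velocity commands (assumed linear blend)."""
    return tuple(alpha * r + (1.0 - alpha) * c
                 for r, c in zip(v_rider, v_caregiver))
```

For example, a forward lean of 15 degrees with a 5-degree dead zone and gain 0.02 would yield a forward speed of 0.2 m/s, and `shared_control` would average that with whatever the caregiver's tracked motion commands.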
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11583/3000507