A comparison between gesture tracking models, and the development of an interactive mobility aid system for the visually impaired

Scattolin, Nicola
2012/2013

Abstract

This thesis covers the study, definition, and setup of systems for tracking human gesture, and the realization of a device that assists visually impaired people in exploring known or unknown environments. The work started at the Institute for Psychoacoustics and Electronic Music (IPEM) at Ghent University, where a dedicated research area studies the role of the human body in relation to musical activities. IPEM researchers have several types of sensors available for this purpose, one of which is a Motion Capture (MoCap) system installed in the IPEM laboratory. MoCap is an expensive and cumbersome system for capturing people's motions, but it gathers very precise and accurate data. The release of the Microsoft Kinect sensor caught researchers' attention as a possible substitute for MoCap, being a cheaper and more portable device for experiments where cost and installation space are problematic. The Kinect sensor is clearly not as accurate as a MoCap system; this thesis therefore investigates the differences and the errors the Microsoft device makes when estimating a person's positions and movements, taking IPEM's trusted MoCap as the reference system. After the evaluation stage, the thesis continued at the Centro di Sonologia Computazionale (CSC) of the University of Padova, where an application to help visually impaired people was developed. The system, called SoundingARM, is typically installed in the middle of a room together with a Kinect sensor facing the entrance door, ready to recognize anyone who enters. As soon as a person comes in, they can locate obstacles or pieces of furniture simply by moving their arms: SoundingARM identifies the movement a person naturally makes to point at an object and reports the name of the pointed object through a text-to-speech synthesizer. With this device, visually impaired users can build their own mental map of the room without having to go through the door. In conclusion, this work contains extensive documentation of the results obtained in the experiments evaluating the Kinect device, together with a description of the development and implementation stages of the SoundingARM application.
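As a rough illustration of the pointing-to-speech idea described in the abstract (not taken from the thesis itself), the sketch below shows one way a shoulder-to-hand ray from a Kinect-style skeleton could be matched against a list of labelled furniture positions; the joint coordinates, the FURNITURE map, the pointed_object helper, and the angular tolerance are all assumptions made for the example, and the output would normally be passed to a text-to-speech engine rather than printed.

import numpy as np

# Hypothetical room inventory: object name -> 3D position
# (metres, in the Kinect camera frame).
FURNITURE = {
    "table":    np.array([1.2, 0.0, 2.5]),
    "bookcase": np.array([-0.8, 0.3, 3.0]),
    "chair":    np.array([0.5, -0.4, 1.8]),
}

def pointed_object(shoulder, hand, max_angle_deg=15.0):
    """Return the name of the object closest to the shoulder->hand ray,
    or None if nothing lies within the angular tolerance."""
    direction = hand - shoulder
    direction /= np.linalg.norm(direction)
    best_name, best_angle = None, max_angle_deg
    for name, position in FURNITURE.items():
        to_object = position - shoulder
        to_object /= np.linalg.norm(to_object)
        # Angle between the pointing ray and the ray towards the object.
        angle = np.degrees(np.arccos(np.clip(direction @ to_object, -1.0, 1.0)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

# Example joint positions, as a skeleton tracker might report them.
shoulder = np.array([0.0, 1.4, 1.0])
hand = np.array([0.3, 1.3, 1.6])
name = pointed_object(shoulder, hand)
if name:
    print(f"Pointed object: {name}")  # would be sent to text-to-speech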
2012-07-16
96
Motion Capture, MoCap, Kinect, SoundingARM, Visually impaired, Auditory visuals
Files in this item:
Tesi.Nicola.Scattolin.pdf (open access, Adobe PDF, 1.87 MB)


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/15832