Control of a Redundant Manipulator for Human-Guided Assistive Drawing Tasks
BIZZARO, LEONARDO
2024/2025
Abstract
This thesis presents the design, implementation, and validation of a robotic system that integrates an augmented reality (AR) interface with a collaborative manipulator to support creative and assistive tasks for individuals with motor impairments. The proposed framework leverages the Franka Emika FR3 robotic arm, controlled through velocity-based strategies refined with filtering and null-space optimization to ensure stability, accuracy, and ergonomic configurations. A hybrid control stage was first developed to simulate AR pose inputs for safe preliminary validation, followed by the integration of a real AR headset enabling users to guide the robot’s end-effector through natural head movements. Experimental trials assessed both live AR-guided execution and deferred reproduction of pre-planned trajectories. The system achieved sub-5 mm positional accuracy and sub-degree orientation accuracy across conditions. Importantly, patient trials demonstrated that paraplegic participants were able to design figures in the AR environment and reproduce them physically with the robot, confirming both the technical feasibility and the rehabilitative potential of the approach. The results position this work within the growing field of assistive robotics, showing that AR-based human–robot interaction can effectively enhance expressiveness, autonomy, and accessibility. Future research directions include extending the framework beyond drawing toward daily living assistance tasks, thereby broadening the impact of collaborative robots in healthcare and personal support.
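The null-space optimization mentioned in the abstract can be illustrated with the standard pseudoinverse-based redundancy resolution for a kinematically redundant arm such as the 7-DOF FR3: joint velocities are split into a task-tracking term and a term projected into the Jacobian's null space, so a secondary objective (e.g. an ergonomic posture) is pursued without disturbing the end-effector motion. The sketch below is a minimal illustration of that general scheme, not the thesis's actual controller; the function name and the toy Jacobian are hypothetical.

```python
import numpy as np

def nullspace_velocity(J, x_dot, q0_dot):
    """Map a desired end-effector velocity x_dot to joint velocities.

    Uses the Moore-Penrose pseudoinverse for the primary task and the
    null-space projector (I - J^+ J) to add a secondary joint motion
    q0_dot that leaves the end-effector velocity unchanged.
    """
    J_pinv = np.linalg.pinv(J)             # task-space inversion
    N = np.eye(J.shape[1]) - J_pinv @ J    # projector onto null(J)
    return J_pinv @ x_dot + N @ q0_dot

# Toy redundant system: 3 joints, 2 task dimensions.
J = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
x_dot = np.array([0.1, 0.0])               # desired task velocity
q0_dot = np.array([0.0, 0.0, 0.05])        # secondary (posture) motion
q_dot = nullspace_velocity(J, x_dot, q0_dot)
# J @ q_dot reproduces x_dot exactly; the q0_dot contribution is
# absorbed entirely in the null space of J.
```

Because J has full row rank here, the projected secondary term satisfies J N q0_dot = 0, so the drawing task is tracked exactly while the redundant degree of freedom is spent on the secondary objective.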
File: Bizzaro_Leonardo.pdf (7.51 MB, Adobe PDF, restricted access)
The text of this website © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are under a CC0 license.
https://hdl.handle.net/20.500.12608/98553