Control of a robotic arm through hand gestures using computer vision

FERIN, ELI
2024/2025

Abstract

This thesis presents a vision-based system for controlling a robotic arm through hand gestures, enabling intuitive and contactless human-robot interaction. The proposed system captures hand images using a standard RGB camera, processes them to detect hand landmarks, and classifies gestures using computer vision techniques. These recognized gestures are then mapped to control commands for the robotic arm. The research explores key topics such as 3D transformations, gesture recognition methods, and robotic kinematics, integrating state-of-the-art frameworks like MediaPipe and ROS. Experimental results demonstrate the system's effectiveness in real-time control scenarios, highlighting its potential for applications in assistive robotics, industrial automation, and sterile environments.
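As an illustration of the pipeline the abstract describes, a minimal sketch of the gesture-to-command mapping stage is shown below. It assumes landmarks in MediaPipe's 21-point hand layout (wrist at index 0; fingertip/PIP-joint index pairs for the four long fingers) and uses a simple "extended finger count" heuristic; the command names and thresholds are hypothetical, not taken from the thesis.

```python
# Hypothetical sketch: map a 21-point hand landmark set (MediaPipe layout)
# to a robot command via an extended-finger count. Image coordinates are
# assumed, with y increasing downward, so an extended fingertip sits ABOVE
# (smaller y than) its PIP joint.

# (fingertip, PIP joint) index pairs for index, middle, ring, pinky fingers
FINGER_TIP_PIP = [(8, 6), (12, 10), (16, 14), (20, 18)]

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates."""
    count = 0
    for tip, pip in FINGER_TIP_PIP:
        if landmarks[tip][1] < landmarks[pip][1]:  # tip above PIP -> extended
            count += 1
    return count

# Hypothetical gesture vocabulary; a real system would publish these as
# ROS messages to the arm controller.
GESTURE_TO_COMMAND = {0: "stop", 1: "move_up", 2: "move_down", 4: "open_gripper"}

def gesture_command(landmarks):
    """Return the command string for the recognized gesture, or 'hold'."""
    return GESTURE_TO_COMMAND.get(count_extended_fingers(landmarks), "hold")
```

In a full system the landmark list would come from a detector such as MediaPipe Hands running on each camera frame, with temporal smoothing before a command is issued.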
Keywords: Robotic arm, Computer vision, Hand gestures
Files in this item:

File: Ferin_Eli.pdf
Access: open access
Size: 21.06 MB
Format: Adobe PDF

The text of this website © Università degli studi di Padova. Full texts are published under a non-exclusive license. Metadata are released under a CC0 license.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/84374