Human-Robot Interaction: moving a robotic arm through haptic devices
TORNESE, GIANLUCA
2009/2010
Abstract
Space exploration is today a highly challenging research theme that involves several domains, such as engineering, robotics, telecommunications, HMI (human-machine interaction) and artificial intelligence. STEPS (Sistemi e Tecnologie per l'EsPlorazione Spaziale) is a research project funded by the Piedmont Region and led by Thales Alenia Space that aims at studying existing technologies and analysing new solutions for space exploration, which increasingly involves both human and robotic presence. Because of the unknown conditions of the environment in which tasks are carried out, and in order to guarantee full functionality, robots are so far only partially autonomous; some form of control by a human operator is highly desirable, and in practice mandatory. Teleoperation is the methodology adopted to achieve this form of control, since it solves the problem of controlling devices at a distance.

In this work we focus on teleoperation guidance systems for robotic arms, analysing how a human user can drive a robotic arm through its joints. The desired interface should allow the user to move the robotic arm through the movements of his or her own arm, so a tracking system is introduced to detect the user's movements. In particular, we analyse the problem of an astronaut moving a robotic arm without common control devices such as keyboard, mouse, joystick or joypad, and without optical systems, which are not suitable for the scenario considered. We study which kind of interface and hardware can satisfy these requirements, and how to build an intuitive, easy-to-use teleoperation guidance system. Linear accelerometers are adopted for the tracking requirement, while fibre-optic data gloves are used for discrete signalling through hand gesture recognition. The system developed can drive any robotic arm, regardless of its number of degrees of freedom, because the association is made with the end effector only.

The tracking and hand gesture recognition interfaces have been tested on both digital and physical systems. The first digital integration is on a commercial system that provides human manikin simulation through a complex inverse kinematics library. The second integration is on the VERITAS virtual reality environment, developed in the Virtual Reality Laboratory of Thales Alenia Space in Turin; since this system currently lacks an inverse kinematics module, we implemented in our interface the analytical computation that solves the joint positions during the robotic arm movements. Finally, the interface is tested on a physical system, where again the solution is provided in terms of equations whose output values correspond to the angle of each joint. Through these integrations we have been able to test both the numerical and the analytical approaches to the inverse kinematics problem.
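As an illustration of the analytical approach mentioned above, consider a hypothetical planar arm with two links of lengths $l_1$ and $l_2$; this is a minimal sketch only, and the two-joint geometry and symbols $(x, y)$, $\theta_1$, $\theta_2$ are assumptions for the example, not the manipulators actually used in this work. Given a target end-effector position $(x, y)$, the joint angles can be written in closed form:

\[
D = \frac{x^2 + y^2 - l_1^2 - l_2^2}{2\, l_1 l_2}, \qquad
\theta_2 = \operatorname{atan2}\!\left(\pm\sqrt{1 - D^2},\; D\right), \qquad
\theta_1 = \operatorname{atan2}(y, x) - \operatorname{atan2}\!\left(l_2 \sin\theta_2,\; l_1 + l_2 \cos\theta_2\right).
\]

The sign of the square root selects the elbow-up or elbow-down configuration, and a solution exists only when $|D| \le 1$, i.e. when the target lies inside the reachable workspace. Arms with more degrees of freedom generally do not admit such a simple closed form, which is where the numerical inverse kinematics library used in the first digital integration comes into play.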
File: 283338_humanrobotinteraction-movingaroboticarmthroughhapticdevices(gianlucatornese).pdf
Type: Other attached material
Access: not available
Size: 11.38 MB
Format: Adobe PDF
Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/20.500.14240/16491