FLYSMART: Glove Hand Gesture Drone Control in Augmented Reality Environments
This project introduces an intuitive human-robot interaction interface based on gesture detection within an Augmented Reality (AR) environment. A wearable device allows users to control a virtual robot, specifically an unmanned aerial drone, and maneuver it through designed levels. The implemented system, Flysmart, comprises a smart wearable glove, a mobile application, and a cardboard headset, letting the user control the drone via hand gestures within an immersive AR environment. A group of users was asked to try the Flysmart system; they showed progression as they played the game, improving with each attempt.

Project Details
- Student(s): Khaled Jalloul, Hala Saadeh, Abbas Farhat, and Bernard Lichaa El Khoury
- Advisor(s): Dr. Noel Maalouf
- Year: 2022-2023
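
The project description does not specify how glove readings are translated into drone commands. Below is a minimal, hypothetical sketch of one plausible pipeline: classify a glove sample (per-finger flex plus IMU pitch) into a discrete gesture, then map that gesture to a velocity setpoint for the virtual drone. All sensor fields, thresholds, gesture names, and command values here are assumptions for illustration, not details from the Flysmart implementation.

```python
# Hypothetical sketch: thresholds, gesture names, and command values are
# illustrative assumptions, not taken from the Flysmart project.

from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    HOVER = auto()
    FORWARD = auto()
    BACKWARD = auto()
    ASCEND = auto()
    DESCEND = auto()


@dataclass
class GloveSample:
    """One glove reading: per-finger flex (0 = open, 1 = fully bent)
    and hand pitch from an IMU, in degrees."""
    flex: tuple[float, float, float, float, float]  # thumb .. pinky
    pitch_deg: float


def classify(sample: GloveSample) -> Gesture:
    """Map a glove sample to a discrete gesture with simple thresholds."""
    bent = sum(1 for f in sample.flex if f > 0.6)
    if bent >= 4:                  # closed fist -> hold position
        return Gesture.HOVER
    if sample.pitch_deg > 25:      # open hand tilted up -> climb
        return Gesture.ASCEND
    if sample.pitch_deg < -25:     # open hand tilted down -> descend
        return Gesture.DESCEND
    if bent <= 1:                  # flat open hand -> fly forward
        return Gesture.FORWARD
    return Gesture.BACKWARD        # partial curl -> back up


# Velocity setpoints (x, y, z in m/s; y is vertical) for the virtual drone.
COMMANDS = {
    Gesture.HOVER: (0.0, 0.0, 0.0),
    Gesture.FORWARD: (0.0, 0.0, 1.0),
    Gesture.BACKWARD: (0.0, 0.0, -1.0),
    Gesture.ASCEND: (0.0, 1.0, 0.0),
    Gesture.DESCEND: (0.0, -1.0, 0.0),
}

if __name__ == "__main__":
    # Open hand tilted upward: should classify as ASCEND.
    sample = GloveSample(flex=(0.1, 0.2, 0.1, 0.1, 0.2), pitch_deg=30.0)
    gesture = classify(sample)
    print(gesture.name, "->", COMMANDS[gesture])
```

In a system like this, the glove would stream samples to the mobile application (e.g., over Bluetooth), which runs the classifier each frame and applies the resulting setpoint to the AR drone.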