The HandHawk
Team: Benjamin Bodily, Christian Whitney and Jared Bronson
Project
In the next three years, the global commercial quadcopter drone market is estimated to grow by $126 billion. As of March 2025, over one million drones are registered. Yet despite this growth in drone usage, only five companies support drone accessibility for disabled users, and no existing drone controllers are motion-based. As a result, a user with a disability such as missing digits cannot fly a standard remote-control drone without modifications.
System
Methods
- Motion is detected by a hand-tracking camera, the Ultraleap 2. The camera scans for preprogrammed gestures until one is recognized.
- Each gesture has an associated action, such as takeoff, landing, or flight along a given vector.
- C++ is used to interface with the camera through USB and with the drone through Wi-Fi.
Conclusion
Results:
- The HandHawk system demonstrates that a quadcopter drone can receive and execute commands from a motion-based controller.
Going forward:
- To increase versatility, integrate the Open Source Computer Vision (OpenCV) library so that almost any camera can be used.
- OpenCV would also allow a wider range of gestures, enabling support for a broader set of disability situations.
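One way a camera-agnostic pipeline could classify a wider gesture set is from hand keypoints: count how many fingertips lie beyond a distance threshold from the palm center, then map that count to a gesture. The landmark layout and thresholds below are assumptions for illustration, not the output format of any specific tracking library.

```cpp
#include <array>
#include <cmath>
#include <string>

struct Point { float x, y; };

// Count fingertips farther from the palm center than `threshold`.
// The palm-center + five-fingertip layout is a hypothetical stand-in
// for keypoints a vision-based hand-tracking pipeline could produce.
int extendedFingers(const Point& palm, const std::array<Point, 5>& tips,
                    float threshold) {
    int count = 0;
    for (const auto& tip : tips) {
        float dx = tip.x - palm.x, dy = tip.y - palm.y;
        if (std::sqrt(dx * dx + dy * dy) > threshold) ++count;
    }
    return count;
}

// Map a finger count to a gesture name (illustrative only).
std::string classify(int fingers) {
    if (fingers == 5) return "open_palm";
    if (fingers == 0) return "closed_fist";
    if (fingers == 1) return "point";
    return "unknown";
}
```

Because the classifier consumes only 2D keypoints, it is independent of which camera or detector produced them, which is the versatility the OpenCV direction aims for.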