Gesture Controlled AMR - Team 35

The goal of this capstone project is to advance GyroPalm VIMPAACT [1], a framework that integrates GyroPalm Spectrum [2] with the GyroPalm Omnibot V2 [3]. This combines a gesture-controlled wearable and an AR HUD (the Vuzix Blade [4]) with an autonomous mobile robot. Overall, this project should demonstrate how professionals can control, monitor, and visualize robotic systems through gestures and AR overlays. GyroPalm VIMPAACT is state-of-the-art technology with a wide variety of existing use cases. This project is necessary because traditional teleoperation methods are not intuitive: they often require joysticks, keyboards, and/or complex UIs, making the products difficult to use. There is no other product like it. GyroPalm VIMPAACT is designed to be hands-free and user-friendly, improving efficiency, increasing safety, and reducing training time. Some adjacent technologies exist, but no other product directly competes with GyroPalm: smartwatches, AR glasses, and VR robotics controllers are all available, yet no product combines all three technologies for use in industrial IoT settings. The GyroPalm solution will have a significant impact on warehousing, logistics, automotive, hospitality, and university systems. The GyroPalm technology accelerates automation adoption, bridges the gap between workers and autonomous systems, and demonstrates practical use cases for AR beyond entertainment and consumer markets.

Presentation Video Link
Team Photo
Photo of Team 35
Team Poster
Team Contact
Problem Statement/Summary

GyroPalm VIMPAACT needs to be fine-tuned in conjunction with the Spectrum technology so that it can navigate more efficiently in obstacle-dense environments. Completing this goal will allow users to more easily operate GyroPalm's technology together with the Vuzix Blade and Spectrum. By the end of the project, the GyroPalm wearable will be integrated with the myCobot arm, and the Omnibot V2 will navigate autonomously on the fourth floor of Lambertus.