Abstract
Industry 4.0 has created a growing need for effective human-robot collaboration (HRC). As robots and humans work more closely together, efficient communication becomes essential for coordinating their actions seamlessly. While speech may seem like the obvious choice for communication, noisy factory environments can render it impractical. Additionally, workers often have their hands occupied with assembly tasks, making hand-controlled interfaces less practical for controlling robots. To address these challenges, this paper presents a novel, hands-free method for robot control using electrooculography (EOG) signals, specifically eye movements and blinks, with unmanned aerial vehicles (UAVs) used as the demonstration platform. We developed a real-time system that captures EOG signals and pairs them with a graphical user interface (GUI), allowing users to direct their gaze toward on-screen commands and confirm selections with blinks. An LSTM (Long Short-Term Memory) model was trained to detect gaze coordinates and blink events from the EOG data. For performance benchmarking, we implemented a vision-based video-oculography (VOG) system as a camera-based baseline and compared both approaches under various conditions, including low light, wind disturbances, and deployment on embedded hardware as a resource-constrained device. Our comparative analysis showed that the EOG-based method achieved 97.6% accuracy in command detection under normal conditions and maintained strong performance across challenging environments where the vision-based VOG system experienced significant degradation. The proposed EOG method also operated in real time at 47.97 FPS on a high-performance workstation and 38.87 FPS on the resource-constrained NVIDIA Jetson Orin Nano. On this platform, it ran 13.9x faster and used 319x less physical memory than the VOG-based alternative. These results underscore the advantages of EOG-based control in delivering reliable, real-time performance on both high-performance workstations and resource-constrained platforms, outperforming vision-based alternatives in efficiency and adaptability.
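To illustrate the kind of model the abstract describes, the sketch below shows a minimal LSTM in PyTorch that maps windowed EOG samples to gaze coordinates and a blink probability. It is not the authors' implementation: the channel count, window length, layer sizes, and class names (e.g. EOGGazeBlinkLSTM) are illustrative assumptions.

```python
# Minimal sketch of an LSTM mapping windowed EOG samples to gaze
# coordinates and a blink logit. Channel count, window length, and
# layer sizes are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn


class EOGGazeBlinkLSTM(nn.Module):
    def __init__(self, n_channels: int = 2, hidden_size: int = 64, num_layers: int = 2):
        super().__init__()
        # LSTM over the EOG time series, input shaped (batch, time, channels)
        self.lstm = nn.LSTM(
            input_size=n_channels,
            hidden_size=hidden_size,
            num_layers=num_layers,
            batch_first=True,
        )
        # Regression head: on-screen gaze coordinates (x, y)
        self.gaze_head = nn.Linear(hidden_size, 2)
        # Classification head: blink / no-blink logit
        self.blink_head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor):
        # x: (batch, time_steps, n_channels) windowed EOG samples
        _, (h_n, _) = self.lstm(x)
        last_hidden = h_n[-1]            # last layer's final hidden state
        gaze_xy = self.gaze_head(last_hidden)
        blink_logit = self.blink_head(last_hidden)
        return gaze_xy, blink_logit


if __name__ == "__main__":
    model = EOGGazeBlinkLSTM()
    window = torch.randn(8, 250, 2)      # e.g. 8 windows of 250 samples, 2 EOG channels
    gaze, blink = model(window)
    print(gaze.shape, blink.shape)       # torch.Size([8, 2]) torch.Size([8, 1])
```

In such a setup, the gaze head would drive selection of the on-screen command under the user's gaze, while a thresholded sigmoid on the blink logit would serve as the confirmation signal.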
Recommended Citation
N. Zendehdel et al., "Hands-Free UAV Control: Real-Time Eye Movement Detection Using EOG and LSTM Networks," IEEE Access, Institute of Electrical and Electronics Engineers, Jan 2025.
The definitive version is available at https://doi.org/10.1109/ACCESS.2025.3578558
Department(s)
Mechanical and Aerospace Engineering
Publication Status
Open Access
Keywords and Phrases
Electrooculography (EOG); Human-Machine Interface (HMI); Long Short-Term Memory (LSTM); Machine Learning; Video-oculography (VOG)
International Standard Serial Number (ISSN)
2169-3536
Document Type
Article - Journal
Document Version
Final Version
File Type
text
Language(s)
English
Rights
© 2025 The Authors. All rights reserved.
Creative Commons Licensing
This work is licensed under a Creative Commons Attribution 4.0 License.
Publication Date
01 Jan 2025
