Real-Time Human-Computer Interaction using Eye Gazes

Abstract

Eye gaze emerges as a unique channel in human–computer interaction (HCI) that recognizes human intention based on gaze behavior and enables contactless access to control and operate software interfaces on computers. In this paper, we propose a real-time HCI system using eye gaze. First, we capture and track eyes using the Dlib 68-point landmark detector and design an eye gaze recognition model to recognize four types of eye gazes. Then, we construct an instance segmentation model to recognize and segment tools and parts using the Mask Region-based Convolutional Neural Network (R-CNN) method. After that, we design an HCI software interface by integrating and visualizing the proposed eye gaze recognition and instance segmentation models. The HCI system captures, tracks, and recognizes the eye gaze through a red–green–blue (RGB) webcam and provides responses based on the detected eye gaze, including tool and part segmentation, object selection, and interface switching. Experimental results show that the proposed eye gaze recognition method achieves an accuracy of greater than 99% within the recommended distance between the eyes and the webcam, and the instance segmentation model achieves an accuracy of 99%. The experimental results of the HCI system operation demonstrate the feasibility and robustness of the proposed real-time HCI system.
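For illustration, the sketch below shows how eye regions might be captured and tracked from an RGB webcam with Dlib's pretrained 68-point facial landmark detector, the first stage described in the abstract. It is a minimal example under stated assumptions, not the authors' implementation: the landmark model file, the webcam index, and the way the extracted eye landmarks would feed a downstream gaze recognition model are all assumptions for demonstration.

    # Minimal sketch: eye capture and tracking with Dlib's 68-point landmarks.
    # Not the paper's implementation; gaze classification itself is omitted.
    import cv2
    import dlib
    import numpy as np

    detector = dlib.get_frontal_face_detector()
    # Pretrained model file must be downloaded separately from dlib's model zoo.
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    RIGHT_EYE = list(range(36, 42))  # landmark indices of the right eye
    LEFT_EYE = list(range(42, 48))   # landmark indices of the left eye

    def eye_points(shape, indices):
        """Return the (x, y) coordinates of the requested eye landmarks."""
        return np.array([(shape.part(i).x, shape.part(i).y) for i in indices])

    cap = cv2.VideoCapture(0)  # RGB webcam (index 0 assumed)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for face in detector(gray):
            shape = predictor(gray, face)
            for eye in (eye_points(shape, LEFT_EYE), eye_points(shape, RIGHT_EYE)):
                # Draw the tracked eye contour; a gaze recognition model would
                # take these landmark coordinates (or the cropped eye region)
                # as input to classify the gaze type.
                cv2.polylines(frame, [eye], isClosed=True,
                              color=(0, 255, 0), thickness=1)
        cv2.imshow("eye tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

In a full system such as the one described above, the tracked eye landmarks (or cropped eye patches) would be passed to the gaze recognition model, whose output would then trigger interface responses such as object selection or interface switching.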

Department(s)

Mechanical and Aerospace Engineering

Second Department

Computer Science

Comments

National Science Foundation, Grant CMMI-1646162

Keywords and Phrases

Eye gaze recognition; Human-computer interaction; Instance segmentation; Mask R-CNN

International Standard Serial Number (ISSN)

2213-8463

Document Type

Article - Journal

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2023 Elsevier. All rights reserved.

Publication Date

01 Aug 2023
