Visual programming languages have facilitated the application development process, improving our ability to express programs as well as our ability to view, edit, and interact with them. Yet even in visual programming environments, productivity is restricted by the primary input devices: the mouse and the keyboard. As an alternative, we investigate a program development interface that responds to the most natural human communication modalities: voice, handwriting, and gesture. Speech- and pen-based systems have yet to find broad acceptance in everyday life because their advantages are insufficient to overcome their problems with reliability. However, we believe that a visual programming environment with a multimodal user interface, properly constrained so as not to exceed the limits of current technology, has the potential to increase programming productivity not only for people who are manually or visually impaired, but for the general population as well. In this paper we report on such a system.
J. Leopold and A. Ambler, "Keyboardless Visual Programming Using Voice, Handwriting, and Gesture," Proceedings of the IEEE Symposium on Visual Languages, Institute of Electrical and Electronics Engineers (IEEE), Jan. 1997.
The definitive version is available at http://dx.doi.org/10.1109/VL.1997.626555
Keywords and Phrases
Application Development; Gesture Recognition; Handicapped Aids; Handwriting Recognition; Human Communication; Human Resource Management; Keyboardless Visual Programming; Manually Impaired; Multimodal User Interface; Pen-Based Systems; Productivity; Program Development Interface; Program Editing; Programming Environments; Reliability; Software Tools; Speech Recognition; Speech-Based Systems; User Interfaces; Visual Languages; Visual Programming; Visual Programming Languages; Visually Impaired; Voice Recognition
Article - Conference proceedings
© 1997 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.