There are many situations in which a user may need (or want) to enter text on a mobile device without looking at it. The cause may be situational (e.g., the user is walking to class) or a visual impairment that prevents the user from seeing or reading the screen. My graduate research to date has focused primarily on this topic, but there are several entry methods that I have not yet had the chance to review and compare against those I have studied or developed. I hope to use this paper as an opportunity to explore the topic more thoroughly and perhaps generate new ideas for future work.
Enhancing the Composition Task in Text Entry Studies: Eliciting Difficult Text and Improving Error Rate Calculation
Dylan Gaines, Per Ola Kristensson, and Keith Vertanen
CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, To Appear.
Methods for Evaluating Depth Perception in a Large-Screen Immersive Display
Dylan Gaines and Scott Kuhl
SUI '20: Symposium on Spatial User Interaction, 2020.
VelociWatch: Designing and Evaluating a Virtual Keyboard for the Input of Challenging Text
Keith Vertanen, Dylan Gaines, Crystal Fletcher, Alex M. Stanage, Robbie Watling, and Per Ola Kristensson
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 2019.
Exploring an Ambiguous Technique for Eyes-Free Mobile Text Entry
Dylan Gaines
ASSETS '18: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility, 2018.
The Impact of Word, Multiple Word, and Sentence Input on Virtual Keyboard Decoding Performance
Keith Vertanen, Crystal Fletcher, Dylan Gaines, Jacob Gould, and Per Ola Kristensson
CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018.