Advances in technology have democratized access to information resources. However, many people with disabilities still cannot overcome the accessibility barriers in information technologies such as smartphones. One of the biggest accessibility challenges of smartphones is that people with motor impairments often find it difficult to interact with the touchscreen.

CMA Assistant Professor Mingming Fan and his team address this accessibility issue by designing and analyzing nine eyelid gestures that people with motor impairments can use to control their mobile devices. The research extends the input space of traditional mobile interaction with a new channel of eyelid gestures, significantly increasing the number of interactions a person can perform on a touchscreen device.
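For readers curious about how such an input channel might be processed in practice, the minimal Python sketch below shows one way a stream of per-frame eyelid states (e.g., from a face-landmark detector) could be grouped into runs and matched against simple gesture patterns. The frame format, gesture names, and thresholds here are illustrative assumptions for this article; they are not the detection algorithm from the paper or the code in the team's GitHub repository linked below.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class EyelidFrame:
    """One frame of eyelid state, e.g., sampled at roughly 30 fps (assumed)."""
    left_closed: bool
    right_closed: bool


def encode(frame: EyelidFrame) -> str:
    """Encode a frame as a symbol: B = both closed, L/R = one eye closed, O = open."""
    if frame.left_closed and frame.right_closed:
        return "B"
    if frame.left_closed:
        return "L"
    if frame.right_closed:
        return "R"
    return "O"


def segment(frames: List[EyelidFrame]) -> List[Tuple[str, int]]:
    """Collapse consecutive identical symbols into (symbol, frame_count) runs."""
    runs: List[Tuple[str, int]] = []
    for f in frames:
        sym = encode(f)
        if runs and runs[-1][0] == sym:
            runs[-1] = (sym, runs[-1][1] + 1)
        else:
            runs.append((sym, 1))
    return runs


def classify(runs: List[Tuple[str, int]], long_frames: int = 15) -> str:
    """Map simple run patterns to illustrative gesture labels (hypothetical names)."""
    closed = [r for r in runs if r[0] != "O"]
    if len(closed) == 1 and closed[0][0] == "B":
        return "long-blink" if closed[0][1] >= long_frames else "blink"
    if len(closed) == 1:
        return "wink-left" if closed[0][0] == "L" else "wink-right"
    if len(closed) == 2 and all(r[0] == "B" for r in closed):
        return "double-blink"
    return "unknown"


if __name__ == "__main__":
    # Open for 5 frames, both eyes closed for 5 frames, open again: a short blink.
    frames = (
        [EyelidFrame(False, False)] * 5
        + [EyelidFrame(True, True)] * 5
        + [EyelidFrame(False, False)] * 5
    )
    print(classify(segment(frames)))  # prints "blink"
```

In a real system, the per-frame open/closed states would come from a camera-based eyelid tracker, and the pattern matching would need to handle noise, timing variation, and a richer gesture vocabulary than this toy example covers.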

This outstanding work was recently published and highlighted in the flagship magazine of the ACM, Communications of the ACM (CACM). CACM is the leading print and online publication for the computing and information technology fields, recognized as the most trusted and knowledgeable source of industry information for today's computing professionals. Fan and his team have open-sourced their code on GitHub so that other researchers can build on the results of this research.

Eyelid Gestures for People with Motor Impairments

By Mingming Fan, Zhen Li, Franklin Mingzhe Li

Communications of the ACM, January 2022, Vol. 65 No. 1, Pages 108-115

Paper Link: https://doi.org/10.1145/3498367

Code on GitHub: https://github.com/mingming-fan/EyelidGesturesDetection