BWCNN: Blink to Word, a Real-Time Convolutional Neural Network Approach

Albara Ah Ramli, Rex Liu, Rahul Krishnamoorthy, Vishal I B, Xiaoxiao Wang, Ilias Tagkopoulos, and Xin Liu

2020 International Conference on Internet of Things (ICIOT 2020); Honolulu, Hawaii, USA

Download the paper - Download Dataset - Download Models - Source code on GitHub


Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease of the brain and the spinal cord that leads to paralysis of motor functions. Patients retain their ability to blink, which can be used for communication. Here, we present an Artificial Intelligence (AI) system that uses eye blinks to communicate with the outside world, running in real time on Internet-of-Things (IoT) devices. The system uses a Convolutional Neural Network (CNN) to detect the blinking pattern, which is defined as a series of Open and Closed eye states. Each pattern is mapped to a collection of words that express the patient's intent. To find the best trade-off between accuracy and latency, we evaluated several CNN architectures, including ResNet, SqueezeNet, DenseNet, and InceptionV3. We found that the InceptionV3 architecture, after task-specific hyper-parameter fine-tuning, achieved the best performance, with 99.20% accuracy and 94 ms latency. This work demonstrates how the latest advances in deep learning architectures can be adapted for clinical systems that improve the patient's quality of life regardless of the point-of-care.
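The pattern-to-word step described above can be sketched in a few lines: per-frame CNN predictions (Open/Closed) are collapsed into a blink pattern, which is then looked up in a table of intents. The `PATTERN_TO_WORD` table and the alternating O/C pattern alphabet below are illustrative assumptions, not the paper's exact mapping.

```python
from itertools import groupby

# Hypothetical mapping from blink patterns to words; the paper maps each
# pattern to a collection of words expressing the patient's intent.
PATTERN_TO_WORD = {
    "OCO": "yes",        # single blink
    "OCOCO": "no",       # double blink
    "OCOCOCO": "water",  # triple blink
}

def frames_to_pattern(states):
    """Collapse runs of identical per-frame eye states (O=open, C=closed)
    into a blink pattern string such as 'OCOCO'."""
    return "".join(state for state, _ in groupby(states))

def decode(states):
    """Map a sequence of per-frame CNN eye-state predictions to a word,
    if the resulting blink pattern is known."""
    return PATTERN_TO_WORD.get(frames_to_pattern(states), "<unknown>")

# Example: open, blink, open, blink, open -> pattern 'OCOCO' -> 'no'
print(decode(list("OOOCCOOOCCOO")))  # -> no
```

In a real deployment the input would be the CNN's frame-by-frame Open/Closed classifications over a video stream; blink duration and timing thresholds would likely also matter, but are omitted here for brevity.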


Results: ResNet (100 epochs)

Results: ResNet (500 epochs)

Results: ResNet Transfer Learning (t.l.) [itself] (100 epochs)

Results: ResNet Transfer Learning (t.l.) [official] (100 epochs)

Results: DenseNet (100 epochs)

Results: InceptionV3 (100 epochs)

Results: SqueezeNet (100 epochs)