Developing A Hands-Free Human-Computer Interface That Senses Tooth/Jaw-Movements With Headphones


  • Krishna Jha, Academies of Loudoun
  • Prahas Gantaram
  • Ethan Kantz



Keywords: Headphone HCI, Hands-free HCI, Human-computer interaction


People with motor disabilities, such as limb amputations, are often unable to use computers in the traditional way. Instead, they rely on alternative input methods that sense facial movements, send the sensor signals to an interface, and translate those signals into computer commands. The goal of this research is to create a hands-free Human-Computer Interface (HCI) that detects sounds near the ears produced by tooth and jaw movements (e.g., clicking or grinding) and translates them into computer commands. Because it does not require the use of the hands, this approach aims to make computers accessible to a wider range of people. The novelty of the proposed HCI lies in how it gathers input: the signals come entirely from a pair of headphones, making the system inexpensive and user-friendly. Our goal is to create a proof-of-principle setup demonstrating the feasibility of this HCI. To achieve this, a pair of headphones was wired to an M-Audio amplifier, and the computer's sound settings were configured so that both the speaker and the microphone used the M-Audio amplifier. A script written in Octave plotted the signals recorded from the amplifier (captured through the headphones). While the script was running, facial movements performed by the user, such as a tooth click or opening the jaw, generated audio signals that were recorded by the computer. The script then mapped the frequency distribution of the recorded waves. Our work so far shows that headphones can indeed capture distinct inputs from different facial movements.
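The frequency-mapping step described above can be sketched as follows. This is a minimal Python/NumPy analogue of the Octave script, not the authors' actual code: the sample rate, window length, and the synthetic "tooth click" stand-in (a short damped burst, used here because no recording hardware is available) are all illustrative assumptions.

```python
import numpy as np

FS = 44_100          # sample rate in Hz (a common audio default; assumed)
DURATION = 0.05      # 50 ms analysis window (assumed)

def frequency_distribution(signal, fs=FS):
    """Return (frequencies, magnitudes) for a real-valued audio window."""
    magnitudes = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs, magnitudes

# Synthetic stand-in for a recorded tooth click: a damped 900 Hz burst.
t = np.arange(0.0, DURATION, 1.0 / FS)
click = np.exp(-60.0 * t) * np.sin(2.0 * np.pi * 900.0 * t)

freqs, mags = frequency_distribution(click)
peak_hz = freqs[np.argmax(mags)]
print(f"dominant frequency: {peak_hz:.0f} Hz")
```

In the actual setup, `click` would instead be a buffer recorded from the headphone/amplifier input; distinguishing movements then reduces to comparing the resulting frequency distributions.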



How to Cite

Jha, K., Gantaram, P., & Kantz, E. (2023). Developing A Hands-Free Human-Computer Interface That Senses Tooth/Jaw-Movements With Headphones. Journal of Student Research, 11(4).
