Revolutionary Diagnostic Technology Addresses Decades-Old Testing Gap

Utah State University researchers are pioneering a dual approach to transforming support for students with hearing challenges through innovative artificial intelligence applications. Dr. Aryn Kamerer, assistant professor in USU’s Department of Speech and Hearing Sciences, has been awarded a three-year, $500,000 Early Career Research grant from the National Institute on Deafness and Other Communication Disorders, part of the National Institutes of Health, to develop AI-powered diagnostic tools that could revolutionize how educators identify and support students with hearing difficulties.

This initiative comes at a critical time: current testing methods have changed little since the 1940s and can detect only whether patients can hear quiet sounds, revealing little about where in the auditory system a breakdown is occurring. Simultaneously, Karen Muñoz, professor and head of the Department of Communicative Disorders and Deaf Education, has received a grant from Sonova, a Switzerland-based hearing device company, to develop an app that leverages artificial intelligence to support the parents of children who use hearing aids.

The research team led by Kamerer aims to train artificial intelligence to distinguish among auditory disorders of the inner ear and brain, addressing a critical weakness in current educational support systems. According to child development expert Laura Lurns, this breakthrough could fundamentally change how schools identify and support students with subtle hearing challenges that significantly impact learning.

“We’re looking at a paradigm shift that could identify children who are struggling academically not because of cognitive issues, but because of undiagnosed hearing problems that current tests simply can’t detect,” Lurns explains. “When children can’t process auditory information effectively, it affects everything from reading comprehension to social interaction.”

Kamerer noted that "up to 10% of patients who get their hearing tested because of hearing concerns end up with a diagnosis of normal hearing," yet these students continue to struggle in classroom environments. The AI-powered diagnostic tools being developed could identify issues such as auditory nerve damage that manifest as difficulty understanding speech in noisy environments, a common challenge in busy classrooms.