About a million Americans have trouble hearing high-pitched sounds, especially military veterans and people who have worked industrial jobs. But the makers of a new hybrid cochlear implant hope to help.
“Okay, can you hear me?”
“So let’s just do the beeping sounds. Raise your hand every time you hear it, okay?”
Alison Singleton is an audiologist at New York University’s Cochlear Implant Center. She’s sitting in a small sound booth giving a hearing test to her patient, Karyn Reyher, on the other side of a pane of soundproof glass. Alison is playing low, mid, and high-pitched warbling tones to Karyn.
“Karyn are you hearing it?”
“Is that it?”
“Yeah, that’s it.”
Bill Shapiro, the supervising audiologist, says Karyn has perfectly normal hearing at low frequencies, between 250 and 500 Hertz. But she has severe to profound loss at high frequencies. It’s sometimes called “ski-slope” hearing loss, because of the way a graph of the patient’s hearing looks.
“We know in speech that the vowels of speech are low frequency and the consonants are high frequencies,” says Shapiro. “So Karyn has all the access to low frequency sound she needs but she has no access to the high frequency consonants. And we know in the English language, consonants make up about 75 percent of the understanding of the word. The differences between “carve” and “car” and “cat” and “cap” are consonant differences.”
Karyn tried hearing aids when her hearing started to deteriorate in her early 30s, but even turning up the volume couldn’t restore those crucial high-frequency sounds of speech.
“So here I am, someone’s talking about FiOS, and I think that they’re talking about Paris,” says Karyn. “I’m answering them, ‘I love Paris! My vacation in Paris was the greatest,’ and, meanwhile, they’re saying, ‘Karyn what are you talking about?’ So it becomes really embarrassing socially for a person who can’t hear that you’re embarrassed, that you’re going to really flub up this entire conversation. So it’s a social issue as well as a medical issue.”
In 2011, Karyn had surgery on her left ear to get what’s called a hybrid cochlear implant.
Cochlear implants have two parts: external and internal. The external component picks up ambient sound with a microphone and translates it into a pattern of electrical stimulation using speech-processing algorithms. That information is then transmitted by radio to the internal device, embedded in the patient’s skull, which stimulates the inner ear through a string of tiny electrodes threaded into the cochlea.
“The cochlea’s filled with fluid and it’s also filled with hair cells, little hairs,” says Shapiro. “And if you think of the cochlea as a spiral staircase, the pole that holds the spiral staircase up in the middle is where the nerve comes out. So the job of the cochlea is to have those hairs move to this mechanical energy and transduce, or change, mechanical energy to electrical energy that the nerve understands. So what we’re doing is we’re taking the place of the hair cells by putting an electrode array into the cochlea.”
Traditional cochlear implants have been around for decades. But they were for people who’d lost all of their hearing, not people like Karyn. Her low frequency hearing was actually too good.
That’s where the new hybrid cochlear implant, the one Karyn has, comes in. It combines a hearing aid, which boosts the low-frequency sounds patients can still hear, with a cochlear implant, which replaces the missing high-frequency sounds with electrical stimulation.
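The hybrid’s division of labor can be sketched as a simple crossover: split the incoming sound into a low band (left to the acoustic hearing-aid path) and a high band (handed to the electrodes). The sketch below is purely illustrative, not a real device’s signal chain; the sample rate, tone frequencies, and the crude moving-average low-pass filter are all assumptions chosen to keep the example self-contained.

```python
import math
from collections import deque

SAMPLE_RATE = 16000  # Hz; illustrative choice, not a device spec

def make_tone(freq_hz, n_samples):
    """A pure sine tone at the given frequency."""
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
            for i in range(n_samples)]

def moving_average(signal, window):
    """Crude low-pass filter: passes low frequencies, attenuates high ones."""
    out, buf, acc = [], deque(), 0.0
    for x in signal:
        buf.append(x)
        acc += x
        if len(buf) > window:
            acc -= buf.popleft()
        out.append(acc / len(buf))
    return out

def rms(signal):
    """Root-mean-square level, a rough loudness measure."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

# One second of audio: a 250 Hz "vowel-range" tone plus a 4 kHz
# "consonant-range" tone, mimicking the two ends of Karyn's audiogram.
n = SAMPLE_RATE
mixed = [lo + hi for lo, hi in zip(make_tone(250, n), make_tone(4000, n))]

# Acoustic path (hearing-aid side): the low band the patient still hears.
low_band = moving_average(mixed, 16)
# Electric path (implant side): whatever the low-pass removed -- mostly the
# high band, plus some phase-lag residue from this very crude filter.
high_band = [m - l for m, l in zip(mixed, low_band)]
```

With a 16-sample window at 16 kHz, the 4 kHz tone lands on a null of the filter and is almost entirely removed from the low band, while the 250 Hz tone passes nearly untouched; a real processor would use much sharper filterbanks, but the split-and-route idea is the same.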
The FDA approved the device earlier this year, after reviewing the results of a clinical trial that included Karyn and 49 other patients. Post-surgery, three quarters of the patients significantly improved their understanding of words and sentences. But the surgery has risks: every candidate for it still has decent hearing at low frequencies, and the operation itself can erase that residual hearing.
“When you put an electrode in an ear that’s supposed to work a certain way, you’re going to typically see that a patient’s going to probably lose some of their hearing,” Shapiro explains.
In fact, 22 of the 50 patients in the trial lost all, or nearly all, of their remaining natural low-frequency hearing in the implanted ear. But the device can be programmed to compensate, so even for patients who lost hearing during the surgery, everyday hearing was as good as, or better than, it was before.
But electrical stimulation still doesn’t capture all the nuances of the human voice.
“When I got connected, I could hear and I could understand, it just sounded like people were talking like robots,” Karyn remembers. “But hey, talking to robots and being able to understand robots is better than not being able to understand people at all!”
But the brain is flexible, and over time, it can transform robot speech into something a little more human. Karyn remembers her experience adjusting to the implant: “One day, probably a month [after the surgery], I said, ‘Wait a minute, people sound normal now!’ So it was just my ear adjusting, and in about a month or so, it really does start to sound like regular speech again.”
Bill estimates that about a million Americans have “ski-slope” hearing loss, like Karyn, and could benefit from this technology—especially military veterans, or people who’ve worked noisy, industrial jobs.
But it’s not cheap. Jay Rubinstein, an otolaryngologist and bioengineer at the University of Washington, says the device, surgery, hospitalization, and a year’s worth of audiology support typically cost about $70,000 per ear. Karyn said the charge for her surgery alone was $118,000. Her insurance covered the costs after she paid her out-of-pocket maximum, but many other private payers, including United Healthcare, don’t cover the hybrid implant yet, and neither does Medicare, because the device was so recently approved.
Music to Karyn’s ears
Three months ago, Karyn had a hybrid device implanted in her right ear, too, making her the first patient to have hybrids on both sides. And though her speech comprehension has improved, she says music sounds distorted and flat.
“Michael Jackson will never sound the same as he used to sound, because he was a high frequency singer. For me, although the second boosted my ability to understand, it changed the nuance a little bit for me.”
Take the sound of a cello. The intricate machinery of the human ear can discern every nuance of its rich sound, but through a cochlear implant, it comes across as metallic and out of tune. The implant’s 22 electrodes can’t provide what Jay Rubinstein calls the “broadband spectral resolution” necessary to enjoy the complex harmonic ingredients of music. But he and his colleagues are working on that.
“We have speech processing strategies that can improve timbre perception,” says Rubinstein. “In theory, we could take a hybrid user, give them a strategy like this and turn them into a sort of a super cochlear implant user. They would not just have great speech perception in quiet and noise, and great melody perception through their low frequency acoustic hearing, but better timbre perception, also. They’d sort of be the superstars of the cochlear implant world.”
As for Karyn, she says she’s more than willing to forgo enjoying music in order to hear the voice of her three-year-old son.
“Children are known to have high frequency voices,” Karyn smiles. “And I could understand my son’s first words, and now I can understand him perfectly. Knowing what I know now, I would never be able to have that experience if I didn’t have the cochlear implant.”