A young deaf girl — maybe four or five years old — puts on a pair of ordinary-looking glasses at Universal Studios. Suddenly, a sign language interpreter appears in her line of sight, translating the ride experience into her native language in real time. She starts signing excitedly to her mum. She's not missing a thing.

"I can't tell you how these glasses are changing people's lives," says Monique Clark, COO of SignGlasses, who is a CODA — a child of deaf adults. "People put on these glasses and the reactions — the tears — just being able to have this access."

The Technology: What's Actually Working

Several companies are now shipping smart glasses that help deaf and hard-of-hearing people communicate more easily, but they work in different ways — and the distinction matters.

SignGlasses, a US-based company partnering with hardware maker Vuzix, connects deaf users with remote human sign language interpreters and captioners. The interpreter appears on a transparent display built into glasses that look like a regular pair of specs. It's already deployed at both Universal Studios parks in the US, as well as universities and workplaces.

Crucially, SignGlasses uses certified human interpreters — not AI. "We want to make sure 100% accuracy," Clark told Hearing Health & Technology Matters. "There's accents, there's all these different things that can interrupt that process."

Meanwhile, XRAI Glass, a UK startup, takes a different approach. Its AI-powered glasses capture surrounding speech and display real-time text subtitles on the lenses. Founder Jack Bloomfield said the idea came from watching his grandfather struggle with hearing aids but instantly engage when TV captions were switched on. "They can simply put on a pair of smart glasses and immediately see the words being said around them," Bloomfield told GMA News.

Other players include RayNeo, whose X3 Pro AR glasses offer real-time spoken language translation across 100+ languages with subtitles projected into the wearer's field of view, and HearView, which focuses on captioning for the deaf community.

The Promise — and the Caveat

If this all sounds like the communication revolution deaf communities have been waiting for, there's an important caveat. The most headline-grabbing claim — that AI can translate sign language itself into text or speech in real time — remains largely aspirational.

Current caption glasses translate speech to text. Going the other direction — recognising the complex, three-dimensional grammar of sign language, including facial expressions, body positioning, and regional dialects — is a far harder problem.

The British Deaf Association laid this out plainly in a 2025 discussion paper: "AI BSL is not a wonder-technology, despite the enthusiasm of people with vested interests." The BDA flagged concerns about regional dialect recognition, racial bias in training data, and whether machine interpretation could ever be reliable enough for medical or legal settings.

RNID, the UK's largest hearing loss charity, echoed these concerns, noting that "accuracy requires big data which does not exist for signed languages" and that "current systems struggle with fine detail, like hands, which is essential for communicating meaning."

Even SignGlasses' Clark acknowledged the gap: "AI for automatic speech recognition is out there. For sign language? We're not there yet."

Why It Still Matters

Despite the limitations, what is working today represents a genuine leap forward. For a deaf student in a university lecture hall, caption glasses mean following along without craning to see a distant interpreter. For a child at a theme park, they mean experiencing the magic alongside everyone else. For an employee in a meeting, they mean participating in real time rather than catching up from notes afterwards.

The Jisc National Centre for AI noted that platforms like Signapse are developing AI-generated BSL translation for short-form content like public announcements and train departures — with human review still required. It's incremental progress, but it's real.

The BDA's message to developers is clear: "Deaf people should have leading professional roles in the development, design, delivery, deployment, and evaluation" of these tools. The technology works best when it's built with the community, not just for it.

For now, the glasses on that little girl's face at Universal Studios aren't powered by some AI miracle. They're powered by a human interpreter, smart hardware, and a platform that put accessibility first. And judging by her reaction, that's more than enough.