Much depends on the individual and their medical history, but even with cochlear implants or hearing aids, it takes concentration to decipher speech. Some sounds and words are so similar that it is extremely difficult to distinguish them. For people who rely on lip reading, picking up every word is impossible. Only about 40 percent of the sounds in the English language can be seen on a speaker’s lips, according to the US Centers for Disease Control and Prevention, and that’s under ideal conditions.
The prospect of seeing audible speech transcribed in your field of vision is exciting. It could help people with varying degrees of hearing loss, who often become socially isolated as a result, pick up more of a conversation. The XRAI app also works while you watch TV, which can be handy for live content, where subtitles aren’t always great (or at the cinema, where subtitles are often missing entirely).
But there are some big caveats here. The XRAI app runs on an Android smartphone that must be connected via USB-C to the Nreal Air augmented reality glasses, which cost $379. Yep, you’ll have a wire running from your head down your body. Beyond the cost, wearing glasses can be awkward if you have cochlear implants or hearing aids. And although they’re relatively light for augmented reality glasses, the Nreal Air are still chunky and heavy compared to regular glasses. I can’t imagine wearing them all day.
Another red flag? One of the main reasons someone with hearing loss might want subtitles like this is for noisy environments like coffee shops, or for group conversations with a lot of cross-talk. But Feldman insists we go somewhere quiet for the demo and acknowledges that XRAI Glass doesn’t work well with background noise or multiple people talking.
Then there’s the cost, and I’m not talking about the Nreal glasses. The XRAI Glass Essentials tier is free and offers unlimited transcription and one day of conversation history, but if you want 10 hours of speaker attribution, 30 days of conversation history, and the ability to pin subtitles and customize the user interface, you’ll need the Premium tier, which is free for one month and then jumps to $20 per month. For unlimited speaker attribution, unlimited conversation history, and a “personal AI assistant,” you have to spend $50 per month on the Ultimate tier. That’s a lot of money.
The idea of subtitles for real life has been around for a while. Google published research on wearable subtitles a few years ago and showed off the possibilities of real-time translation in augmented reality glasses at its most recent I/O developer event. A company video shows AR glasses translating languages in real time and subtitling speech for deaf people. Google tells me the technology isn’t ready for prime time, and that there are challenges in making the experience comfortable for people reading text projected into their field of vision.
Based on my short demo, XRAI Glass doesn’t solve these issues. Wearing thick, expensive glasses with subtitles floating in the center of your vision is not ideal. (You need a paid subscription to pin subtitles in 3D space, and I didn’t get to see that.)