When it comes to augmented reality technologies, visuals always seem to be an essential part of most people’s definitions, but one startup is offering an interesting take on audio-based AR that also calls on computer vision. Even without integrated displays, glasses are still an important part of the company’s products, which are designed with vision-impaired users in mind.
Aira has built a service that essentially puts a human assistant in a blind user’s ear by live-streaming footage from the glasses’ camera to the company’s agents, who can then relay audio instructions to the user. The guides can offer directions or describe scenes. The service hinges on the combination of high-tech hardware and highly attentive human assistants.
The hardware the company has run this service on in the past has been a bit of a hodgepodge of third-party solutions. This month, the company began testing its own hardware, the Horizon Smart Glasses, designed from the ground up to be ideal for vision-impaired users.
The company charges based on usage: $89 per month gets users the device and up to 100 minutes of service, with various pricing tiers for power users who need more time.
The glasses integrate a 120-degree wide-angle camera so guides can get a fuller picture of a user’s surroundings and won’t have to ask them to turn their head quite as often. They’re powered by what the startup calls the Aira Horizon Controller, which is actually a repurposed Samsung smartphone supplying the compute, battery and network connection. The controller is, appropriately, operated entirely through physical buttons, and it can also connect to a user’s smartphone if they want to route controls through the Aira mobile app.
Though the startup isn’t planning to part ways with its human assistants anytime soon, the company is predictably aiming to venture deeper into the capabilities offered by computer vision tech. Earlier this month, it announced that it would be rolling out its own digital assistant, called Chloe, which will eventually do far more but is launching with the ability to read: users can point their glasses at some text and hear what’s written. The startup also recently showed off a partnership with AT&T that enables the glasses to identify prescription pill bottles and read the labels and dosage instructions aloud to users.
By Lucas Matney