Current technologies could provide many benefits for the visually impaired if Apple develops them further
Apple has yet to release or announce any sort of smart or augmented reality glasses, yet many rumors have been circulating about a potential release. When Apple announced ARKit, a software development kit that lets developers create augmented reality applications for iPhone and iPad, many video game and furniture apps took the opportunity to use it as a means of bringing virtual items into real life. Furniture apps project their products into the user's living room, and AR video games like Pokémon Go became immensely popular.
However, assuming Apple launches its AR glasses soon, it could combine ARKit and Siri with the smart glasses to make everyday tasks more accessible to those who are visually impaired or blind.
Reading could become immensely simpler
If Apple plays its cards right, that is, if its developers combine ARKit with Siri, the glasses could quickly help the visually impaired identify text. Apple already has a feature on iOS devices, Speak Screen, where Siri's voice reads a webpage or PDF to the user when they swipe two fingers down from the top of the screen.
To a visually impaired or blind person, purchasing Apple’s rumored AR Glasses could be of immense benefit.
Currently, purchasing books in Braille is very expensive, with a single book often running around $100 or more. This is because books in Braille do not use ink. Instead, they use raised dots embossed on the page for the blind to read with their fingers. This takes a lot of paper, since the Braille cells must be large enough to read by touch, and pages often cannot be printed double-sided.
If a user wears the glasses when handed, say, a worksheet at school, they could instruct Siri to read the worksheet aloud (this is of course assuming Apple integrates a camera into the glasses). With a more advanced Siri, the assistant could read the text to the user and pause where appropriate.
If Siri advanced even further, it could scan the document with the glasses' camera, read the text aloud, and ask the user the questions on the worksheet. As the user responds, Siri could create a digital version of the assignment and type in the answers. Siri could then offer to e-mail the finished assignment to the user's teacher, letting them complete their work entirely by voice.
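To make the worksheet idea concrete, here is a rough sketch of the pairing-and-drafting step. Everything in it is hypothetical: the questions, the spoken answers, and the e-mail format are stand-ins for whatever speech recognition and document scanning would actually produce.

```python
# Illustrative sketch only: pair recognized worksheet questions with the
# user's spoken answers and draft an e-mail body. All names and question
# text are hypothetical placeholders.

def draft_assignment_email(student, questions, answers):
    """Pair each recognized question with the user's spoken answer."""
    if len(questions) != len(answers):
        raise ValueError("every question needs an answer")
    lines = [f"Completed worksheet from {student}", ""]
    for i, (q, a) in enumerate(zip(questions, answers), start=1):
        lines.append(f"{i}. {q}")
        lines.append(f"   Answer: {a}")
    return "\n".join(lines)

body = draft_assignment_email(
    "Alex",
    ["What is 7 x 8?", "Name the largest planet."],
    ["56", "Jupiter"],
)
print(body)
```

The hard parts, of course, are the scanning and speech recognition, not this bookkeeping, but the bookkeeping is what turns a read-aloud session into a finished, submittable assignment.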
This would allow the visually impaired to complete their work without requiring worksheets to be printed in large print, and families would not have to purchase the equipment needed for reading or writing in Braille.
Moving around the house becomes more reliable
With Apple's ARKit, users have been able to identify objects through the camera on their iPhone or iPad. So far, Apple's AR can indeed recognize household objects, such as cups or couches.
On that note, if a visually impaired user purchases Apple's AR Glasses, they could easily use Siri to find where something is. All the user has to do is ask Siri, "Where is my cup?" and Siri could respond with "to your left" or "six inches in front of you".
If Apple advances Siri's intelligence and updates ARKit, similar to what Google's Project Tango offered in 2014, Siri could even tell users where they are in their home. The user could ask, "Where am I right now?" and Siri could respond, "In the kitchen." The user could follow up with, "I want to go to bed," and Siri could respond, "Okay, take ten steps forward," and so on.
This would allow the user to walk around without worrying about unexpected obstacles. Siri could tell the user an object is in the way and guide them around it, without the use of a white cane.
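To show how little is needed for the "to your left" responses described above, here is a purely illustrative sketch that turns an object's camera-relative position, the kind of 3D offset AR tracking can provide, into a spoken direction. The coordinate convention, angle thresholds, and metric phrasing are my assumptions, not Apple's API.

```python
import math

def direction_phrase(dx, dz):
    """Turn an object's offset from the user (metres; x = right,
    z = forward) into a spoken direction an assistant might give."""
    angle = math.degrees(math.atan2(dx, dz))  # 0 degrees = straight ahead
    dist = math.hypot(dx, dz)
    if abs(angle) <= 20:
        side = "in front of you"
    elif 20 < angle <= 160:
        side = "to your right"
    elif -160 <= angle < -20:
        side = "to your left"
    else:
        side = "behind you"
    return f"about {dist:.1f} metres {side}"
```

For example, `direction_phrase(-0.5, 0.1)` describes an object half a metre to the user's left. A real assistant would also need to account for which way the user's head is facing, which is exactly the pose information AR tracking maintains frame by frame.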
Combined with ARKit, Siri and Apple's AR Glasses could become a set of eyes for those who are blind or visually impaired, especially for families who cannot spend thousands of dollars on eye surgery.
Of course, this all assumes Apple's developers work to improve Siri's available responses and comprehension of questions, and that Apple updates ARKit to provide better accessibility options.
While some corporations and local governments are now banning the use of facial recognition by the police, facial recognition can still be used ethically to benefit the public.
Using the camera on Apple's AR Glasses could benefit the visually impaired or the blind if Siri identifies the person walking up to the user and notifies them. A simple "John is approaching" would suffice. Imagine how happy the user would be to greet the person by name as they approach!
Probably the riskiest idea of all: Apple could enhance these tools even further so users could leave their house on their own
Granted, this is a far-fetched and dangerous idea, but let’s explore it anyway.
A user could walk out of their house, and Siri could guide them by telling them how many steps ahead to walk. Once the user arrives at, for example, the supermarket, Siri could guide them through each aisle and help them find whatever they need.
If the user tells Siri, "I need to buy bread, ground beef, and sliced cheese," Siri could scan the aisle numbers and their food categories to identify where the items are located. Siri could then guide the user, telling them to walk forward, left, or right a certain number of steps, and then read out the available prices of each product.
Siri could even read the product labels to the user, describing the different kinds of ground beef or cheese available, or helping them choose between wheat and white bread.
Once the user tells Siri they want to head to the cash register, Siri could guide them there, and then guide them home.
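As a toy illustration of the aisle guidance described above, here is how an assistant that knew a store's layout might order a shopping list and produce step instructions. The item-to-aisle map and the step counts are invented for the example; a real system would get them from the store or from the camera.

```python
# Hypothetical sketch: order the shopping list by aisle so the user
# never backtracks, then emit simple step-by-step instructions.
# The store layout and distances below are made up for illustration.

STORE_MAP = {"bread": 2, "ground beef": 5, "sliced cheese": 5}
STEPS_PER_AISLE = 12  # assumed walking distance between aisles

def guidance(shopping_list):
    current_aisle = 0  # start at the store entrance
    instructions = []
    for item in sorted(shopping_list, key=lambda i: STORE_MAP[i]):
        aisle = STORE_MAP[item]
        steps = (aisle - current_aisle) * STEPS_PER_AISLE
        if steps > 0:
            instructions.append(f"Walk {steps} steps forward to aisle {aisle}.")
        instructions.append(f"{item} is in aisle {aisle}.")
        current_aisle = aisle
    return instructions

route = guidance(["ground beef", "bread", "sliced cheese"])
```

Sorting by aisle is the whole trick here: two items in the same aisle generate no extra walking instruction, which is the kind of small courtesy that makes spoken guidance bearable rather than exhausting.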
Should Apple release its Glasses within the next few years, it could provide accessibility options for the visually impaired and the blind that would be of immense benefit. The glasses would offer a better option for those who cannot afford to purchase books, equipment, or computers for reading and writing in Braille. A one-time purchase of Apple's AR Glasses would save users hundreds or thousands of dollars in the long term and would quickly simplify their lives.