How accessible are Apple’s products?
What is Apple doing to make its devices easy to use for people with disabilities, and is there still more to do?
For a number of years now, Siri has seemed to be struggling against its rivals, with Amazon and Google apparently running away with the artificial intelligence game.
Then along came WWDC 2018, and with it a new Shortcuts app. This app, a key part of iOS 12, lets you create workflows that can be triggered by a specific phrase uttered to Siri.
This could be the shot in the arm that Siri so desperately needs, making it much more flexible than it has historically been. But it also comes with a benefit that perhaps wasn’t immediately obvious: it could be great for accessibility.
That’s because you can specify a phrase of your choosing. If you have a speech impediment or find it tricky to say certain things, Shortcuts lets you say something easier to achieve the same goal. And if a disability makes a long-winded task difficult, you can string several tasks together and trigger them all with one simple phrase – doing things that would normally require multiple actions, perhaps across multiple apps.
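The idea is easy to picture in code. Here's a simplified, hypothetical model of the pattern the Shortcuts app uses – a custom spoken phrase mapped to an ordered chain of actions – written as plain Swift for illustration; it is not Apple's actual SiriKit or Shortcuts API, and the phrase and actions are invented examples:

```swift
// A sketch of the Shortcuts idea: one easy-to-say phrase
// triggers an ordered list of actions. Illustrative only –
// not Apple's real SiriKit/Intents API.
struct Shortcut {
    let phrase: String
    let actions: [() -> String]

    // Run every action in order, collecting a result from each.
    func run() -> [String] {
        actions.map { $0() }
    }
}

// Several steps chained behind one short phrase.
let headingHome = Shortcut(
    phrase: "heading home",
    actions: [
        { "Messaged ETA to family" },
        { "Started navigation to home" },
        { "Set thermostat to 21 degrees" },
    ]
)

let registry: [String: Shortcut] = [headingHome.phrase: headingHome]

// Look up a recognised phrase and run its whole chain.
func handle(spokenPhrase: String) -> [String] {
    registry[spokenPhrase.lowercased()]?.run()
        ?? ["No shortcut for that phrase"]
}

print(handle(spokenPhrase: "Heading home"))
```

The point of the pattern is that the phrase itself can be anything the user finds comfortable to say – the complexity lives in the action chain, not in the trigger.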
The Shortcuts app is an example of Apple building accessibility features into its ecosystem, and there are plenty of others. In fact, Apple has often made a point of increasing the accessibility of its products, with Tim Cook saying in a recent interview with TechCrunch that the company believes accessibility is a “fundamental human right”. Let’s see what else Apple is doing.
Facing the challenge
There were other features announced at WWDC 2018 that could have benefits for those in need of accessibility aids. One was the way Face ID works. When the iPhone X was revealed, Apple explained that you can set it to require you to look at the screen to unlock the device – security against you accidentally unlocking your phone. But if you’re blind or partially sighted, you may struggle to convince the iPhone X you are giving it your full attention, and thus may not be able to unlock it.
In a response to questions on this topic from Jonathan Mosen, an author and advocate for the blind, Apple explained: “The iPhone X has been designed with a number of accessibility features to support its use. For VoiceOver users, Face ID will prompt you as to how to move your head during set up in order to complete a scan. If you do not want Face ID to require attention, you can open Settings > General > Accessibility, and disable Require Attention for Face ID. This is automatically disabled if you enable VoiceOver during initial set up.” VoiceOver is a function in iOS and macOS that gives you audio descriptions of on-screen elements, along with hints to help you get things done. It’s ideal for blind or partially sighted people, so it’s interesting to see Apple taking it into account in a headline feature of the iPhone X.
Apple has history in this area. Back in 2004, for example, the company introduced Spoken Interface for Mac OS X. This let users navigate through the operating system using a combination of speech, keyboard navigation and audible cues. More recently, at its iPhone launch event in 2016, Apple kicked the show off with a video highlighting some of the accessibility features baked into its products. The video was edited and narrated by Sady Paulson, a video editor with cerebral palsy.
In fact, accounting for accessibility needs has become ingrained in the way Apple designs its devices. Sarah Herrlinger, Apple’s Director of Global Accessibility Policy and Initiatives, described it this way: “We look at accessibility as something we integrate within the design process. It’s the same as everything else: the goal of making products that are really intuitive and easy to use.”
While there is a common perception that Google has the edge over Apple in the classroom thanks to its incredibly cheap Chromebooks, writer and accessibility advocate Steven Aquino argues that isn’t so when it comes to accessibility. “The [iPad’s] multi-touch user interface is far more intuitive, and more importantly, iOS is built with accessibility in mind,” he writes. “From VoiceOver to Dynamic Type to Switch Control and more, an iPad (or an iPod touch, for that matter) can provide a far more accessible and enriching learning experience for many students with disabilities than a Chromebook.” While much of the analysis of Apple’s success or failure in classrooms focuses on pure numbers, its accessibility nous is often overlooked.
In May 2018, Apple announced that it would launch its Everyone Can Code school curriculum in the autumn. Starting at eight schools in the US that support “students with vision, hearing or other assistive needs”, this would roll out to more schools around the world in due course, according to Tim Cook.
Everyone Can Code is a framework to help children learn to code in a simple and appealing way. It’s compatible with Apple’s VoiceOver tech, meaning you can learn to code without having to see the screen. As well as that, Everyone Can Code will allow students to use FaceTime in order to capture gestures and facial expressions, and it will incorporate features such as Type to Siri, closed captions, Mono Audio and LED Flash for Alerts (which fires your camera flash when you get a new notification). It will also work with Made for iPhone hearing aids.
Everyone Can Code works with Switch Control, which lets you operate your device with joysticks, switches and other adaptive input hardware. An input device can be designated as a switch, then customised with specific actions. For example, you could set the spacebar to scan through a list of items on the screen when you press it; pressing another switch – a different key or a mouse button, say – then selects the item the scan is highlighting.
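The scanning interaction described above can be sketched in a few lines of Swift. This is an illustrative model of the technique, not Apple's actual Switch Control implementation – the class name and item list are invented for the example:

```swift
// A sketch of switch scanning: one switch advances a highlight
// through the items on screen; a second switch activates whatever
// is currently highlighted. Illustrative only – not Apple's
// real Switch Control code.
final class SwitchScanner {
    private let items: [String]
    private var index = -1   // nothing highlighted yet

    init(items: [String]) { self.items = items }

    // First switch (e.g. the spacebar): move the highlight to the
    // next item, wrapping around at the end of the list.
    func scan() -> String {
        index = (index + 1) % items.count
        return items[index]
    }

    // Second switch (e.g. a mouse click): activate the highlighted
    // item, or nothing if scanning hasn't started.
    func select() -> String? {
        index >= 0 ? items[index] : nil
    }
}

let scanner = SwitchScanner(items: ["Mail", "Safari", "Music"])
_ = scanner.scan()                     // highlights "Mail"
_ = scanner.scan()                     // highlights "Safari"
print(scanner.select() ?? "nothing")   // prints "Safari"
```

Because every item is reachable by repeatedly pressing a single switch, the whole interface can be driven with as little as one or two physical inputs.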
It’s not just Apple that’s making progress – third parties are using Apple’s tech to develop their own solutions. In many cases, these innovations wouldn’t have been possible without Apple’s hardware or software, or both.
For example, hearing aid company Cochlear has worked with Apple on developing an implant that works directly with your iPhone. The Nucleus 7 sound processor streams audio from a phone to the Cochlear implant, not only improving the quality of the audio but also making it very easy to adjust the settings by using the phone.
In May 2018, the USB Implementers Forum (USB-IF) announced a new standard for Braille displays. The USB-IF is a group of tech companies such as Apple and Microsoft that works to support the implementation of USB technology. Speaking of the standard, Apple’s Sarah Herrlinger said: “We’re proud to advance this new USB-IF standard because we believe in improving the experience for all people who rely on Braille displays to use their Apple products or any other device.”
These collaborations suggest that Apple is keen to work with other groups not only to promote the accessibility of its own products, but to bring greater accessibility to technology in general.
How can Apple improve?
Of course, there is still much that Apple can do. Aquino believes that Apple’s classroom strategy leaves room for improvement. “Apple is obviously – rightfully – building [its] educational strategy towards mainstream students in mainstream classes,” he writes. The company needs to complement that approach with an expanded toolset for teachers working in special educational classrooms, Aquino believes, in order to make sure they are not passed over in Apple’s efforts.
And while the Touch Bar is great for accessibility by making shortcuts much easier to use, Aquino also suggests that Apple could make it even better by integrating haptic feedback into it. That would provide help to visually impaired users who may not be able to accurately read the characters on the Touch Bar given their relatively small size.
Still, Apple has done a lot of good work in this area, ensuring that accessibility is a key part of the process when developing new technology and devices. Speaking in 2013, Apple’s CEO Tim Cook explained: “People with disabilities… frequently are left in the shadows of technological advancements that are a source of empowerment and attainment for others, but Apple’s engineers push back against this unacceptable reality, they go to extraordinary lengths to make our products accessible to people with various disabilities.” On the whole, that push seems to be working.
VoiceOver is built into macOS and can describe on-screen elements to you. It also works on your iOS device.
Cochlear’s Nucleus 7 hearing implant works closely with your iPhone to stream audio to you.