What is the primary function of a mobile phone? More than fifteen years ago the answer to that question would have been unanimous: “talking on the phone.”
Before the smartphone revolution forever changed consumer technology (and, by extension, the contemporary world as we knew it), mobile phones had a precise, self-descriptive purpose: a phone, but one that could be carried everywhere. Give or take a feature, no one could demand much more from a Nokia 3310.
Today that era seems sunk in the haze of our collective memory. What is a mobile phone for now? The answer varies greatly depending on who is asked. In many ways, however, a smartphone today is a honing of our biological abilities: an expansion of our sight, a refinement of our hearing, a multiplication of our communication skills. It is possible to understand the modern mobile phone as an extension, even an improvement, of the human being.
If the previous sentence seems hyperbolic, it is probably because most of us do not live with a disability. We don’t need the state-of-the-art cameras on our phones to see; we don’t need their auditory recognition systems to hear; we can get around our house without issuing voice commands to activate this or that appliance. It is normal: able-bodied people tend not to imagine the world as people with disabilities live it. That does not change the fact that, for them, a mobile phone can be a projection of themselves.
Few technology companies seem to have understood this with as much interest as Apple. Its Accessibility department has existed since 1985, but it did not reach its peak of visibility (and centrality within the company) until the iPhone. It is the department where Apple shapes its flagship consumer products to be adaptable to people with disabilities. Behind the features that ship with its phones, there are always tools that try to adapt the technology to blind or deaf users.
It is a corporate policy, and one that has earned the company a very good reputation among disability communities.
What is the motivation? Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, answers: “Our first office dedicated to disabilities opened in 1985. To give some context, the Americans with Disabilities Act was passed in 1990. I think that’s a good way to illustrate that our commitment to this issue isn’t because a political regulation forced us to do it,” she explains proudly.
Herrlinger knows what she is talking about: her duties include running Apple’s accessibility programs. “When we think about accessibility, we don’t think about building for individuals,” she elaborates, “but about building tools that serve all kinds of alternative uses of technology.” In that locution (“alternative uses”) lies the key to accessibility: a blind person can use a smartphone, however striking that may seem, only in a very different way from the rest of us.
Literally “seeing” with an iPhone 14
Naturally, Apple brandishes the iPhone as the ultimate argument for its commitment to people with disabilities. Every version of iOS has included some kind of accessibility-specific functionality (the release of the first iPhone, in fact, was met with great joy and surprise by the blind community), with improvements arriving in each update.
The settings screen of the iPhone 14, the latest version of the phone, includes a section dedicated to Accessibility. The tools vary depending on the disability. VoiceOver is one of the most complete and also one of the most famous: once activated, the phone reads the text on the screen aloud, making it possible to understand what the phone is displaying even without sight. It is highly customizable (from reading speed to tone and pronunciation) and has dozens of features.
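For app developers, the mechanics behind VoiceOver are simpler than they sound: the system speaks whatever descriptions an app attaches to its interface elements. A minimal sketch using UIKit’s real accessibility attributes (the view and label text here are invented for illustration):

```swift
import UIKit

final class ProfileViewController: UIViewController {
    private let avatarView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()

        // An image carries no text, so VoiceOver has nothing to read
        // unless the developer supplies a description.
        avatarView.isAccessibilityElement = true
        avatarView.accessibilityLabel = "Profile photo of Daniela"
        avatarView.accessibilityTraits = .image

        view.addSubview(avatarView)
    }
}
```

Anything left unlabeled is effectively invisible to VoiceOver, which is why Apple leans on developers to annotate their interfaces this way.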
Braille input is perhaps the most impressive of all. Daniela Rubio has been an Apple Distinguished Educator (a company representative who teaches other users the advanced features of the iPhone) for a decade. She is also blind. Her handling of the phone is hypnotic: VoiceOver narrates her position on the screen at all times as she activates and deactivates commands, both with her voice and with a braille keyboard.
There’s virtually nothing she can’t do with the phone.
“We can type in braille on the screen,” she explains, moving her fingers at full speed. The keyboard works on six pressure points (three for each hand: Rubio holds the phone horizontally, as if it were a game controller). “It calibrates to your fingers, because you can have very large hands or small ones, like a child’s,” she adds.
Rubio believes the iPhone’s possibilities are immense, but that the blind community often fails to exploit them. It is common, for example, for blind people to use the smaller versions of the iPhone (the mini) because, after all, why would they need a big screen if they cannot see it? For Rubio this is the wrong approach: a smaller, older phone is also a phone with a worse camera (the one the image recognition system relies on) and fewer features.
(Dimitri Karastelev/Unsplash)
The braille keyboard is another good example. “It’s amazing, because Apple has included different types of commands that allow it to be used as a normal keyboard,” she continues, “but it’s a feature the blind community sometimes ignores because, of course, we have dictation.” And dictating is easier than writing. For Rubio, however, the braille keyboard allows writing at high speed in a reading-and-writing method that most blind people already use beyond their mobile phones.
Deaf and blind people are Apple’s natural audience when it comes to accessibility, largely because their disabilities are relatively widespread. But what about the many other people with disabilities whose conditions are rare or uncommon? “Accessibility settings,” Herrlinger replies, “is the most robust section of all of our devices because it offers so many different ways to configure and customize.” Vision and hearing, yes, but also motor and cognitive functions.
This integration of the different existing capacities has its reason for being, according to Herrlinger, and it dates back to the founding of the Accessibility office almost forty years ago: “I think what sets us apart from others is the way we think about accessibility. It’s not a compliance issue; it’s not about trying to check a box or doing the minimum amount of work to comply with a regulation.”
In this sense, the iPhone 14 works as a claim and a flag at the same time. Its accessibility features also encompass physical and motor skills. AssistiveTouch adapts the phone’s touch input to the needs of someone with, for example, reduced motor function; Switch Control allows the iPhone to be used by sequentially activating the elements on the screen; and Face ID makes the unlocking process as simple as possible for someone who, say, has no arms.
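Apps can also detect which of these assistive features a user has switched on and adapt accordingly. A small sketch built on UIKit’s actual status flags (the printed advice is illustrative, not Apple’s guidance):

```swift
import UIKit

func logAssistiveContext() {
    // Real UIKit flags exposing the user's accessibility configuration.
    if UIAccessibility.isVoiceOverRunning {
        print("VoiceOver is on: make sure every element has a label.")
    }
    if UIAccessibility.isSwitchControlRunning {
        print("Switch Control is on: avoid time-limited interactions.")
    }
    if UIAccessibility.isAssistiveTouchRunning {
        print("AssistiveTouch is on.")
    }
}

// Apps can also observe changes as they happen, e.g.:
// NotificationCenter.default.addObserver(
//     forName: UIAccessibility.voiceOverStatusDidChangeNotification,
//     object: nil, queue: .main) { _ in logAssistiveContext() }
```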
Apple usually presents new features every year.
The chicken or the egg?
Face ID is a good example of Apple’s relationship with its accessibility features. The facial recognition system has been one of the iPhone’s main attractions in recent years, above all for its exquisite precision when unlocking the screen or authorizing payments instantly. It is also one of the company’s main arguments for promoting its phone over the competition.
But it is also something else: a tool of incalculable value for millions of people with disabilities. Did it, and many other accessibility functions, come from the company’s commercial drive, or were the technological innovations conceived from the outset with accessibility in mind? Herrlinger thinks about it for a few seconds. “I think it’s a mix. It’s not necessarily one or the other,” she replies.
Part of that ambiguity stems from the way Apple works with disability communities, she argues. Blind people, deaf people, and people with mobility limitations work directly in the Accessibility area. “Sometimes it is our own employees who say: I would really like the phone to do this. For example, the function of detecting people, of understanding how close they are, came from one of our employees who is a member of the blind community,” she adds.
Herrlinger is referring to Magnifier, one of the most well-rounded accessibility tools on the iPhone. Its refinement accelerated during the pandemic, a period marked by social distancing, something that, in her view, underlines the strong feedback loop her department operates under: “We quickly understood how valuable it was for someone to know how close you were to another person. In that sense, feature development is very community-driven.”
In other words, there is a synergy: Apple works on innovations and Accessibility seeks a way to integrate them into a comprehensive program dedicated to people with disabilities. In that sense, Magnifier is revealing: its detection mode uses the LiDAR scanner to read the spatial environment around the phone, relaying the information back to the user. Thanks to it, a blind person can move around “seeing” what surrounds them; they can detect a door or a step; they can even tell a red dress from a green one.
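Apple does not publish Magnifier’s internals, but the building blocks are available to any developer through ARKit’s scene-depth API on LiDAR-equipped iPhones. A rough sketch of how per-pixel distance can be read (the center-sampling and reporting logic is an illustrative assumption, not Apple’s implementation):

```swift
import ARKit
import CoreVideo

final class DistanceSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // depthMap is a CVPixelBuffer of per-pixel distances in meters.
        // A Magnifier-like feature would sample it and speak the result.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        if let base = CVPixelBufferGetBaseAddress(depthMap) {
            let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
            let center = base.advanced(by: (height / 2) * rowBytes)
                .assumingMemoryBound(to: Float32.self)[width / 2]
            print("Estimated distance straight ahead: \(center) m")
        }
    }
}
```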
Putting super-complex tools on a phone is all very well, but it is only part of the story; they are useless if users do not know how to operate them. This is where the role of people like Rubio becomes crucial for Apple: “At first, when my students pick up an iPhone for the first time, they are scared because it is flat. It has no buttons. But there is a learning curve, and it’s easy to pick up the gestures because they are very intuitive.” Once customized, those gestures can also be transferred to other Apple devices.
There is also an outreach exercise to be done. Accessibility is no stranger to the great dilemma of consumer technology: the greater the complexity, the harder it is to reach the mass user. “At first [with smartphones and the iPhone] it was very shocking, because many people were not aware of the accessibility features. Today most blind people use iPhones or iOS devices. And they use the technology, but I think there are many things they don’t exploit because they don’t know how to use them,” she laments.
Her job, she stresses, is to educate them. Should they stick to simpler functions? “Yes, many blind people have the iPhone SE because it’s simpler. But if I had the SE, I wouldn’t be able to detect other people when they approached,” she reasons. She also points to the iPhone 14 camera, much more complete and precise: the combination of higher resolution and VoiceOver lets her photograph her surroundings and hear a description of what she has captured. She opens a photograph of her children on her phone, and the device narrates it in great detail. It is a fascinating exercise because of the naturalness and precision with which she performs it.
“If I have a better camera, I’ll be able to see better. It’s like having a pair of little eyes,” she adds with a laugh. VoiceOver, meanwhile, continues to explain the image and spatially locate all the elements of the photo. Her children included.
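The photo descriptions Rubio hears come from Apple’s own on-device models, whose internals are not public. As a rough approximation of the idea, the Vision framework lets any app classify an image on-device and hand the result to VoiceOver as a spoken announcement (the confidence threshold and phrasing are illustrative choices):

```swift
import UIKit
import Vision

func describeAndAnnounce(_ image: CGImage) throws {
    // Classify the image with Vision's built-in on-device model.
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])

    // Keep only the most confident labels.
    let labels = (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .prefix(3)
        .map(\.identifier)

    guard !labels.isEmpty else { return }
    let sentence = "Photo may contain: " + labels.joined(separator: ", ")

    // If VoiceOver is running, it reads the announcement aloud.
    UIAccessibility.post(notification: .announcement, argument: sentence)
}
```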
Image: Apple