April 24, 2025
Gadget

With a mobile phone, it is possible for a blind person to “see” and a deaf person to “hear”. This is how Apple works on it

February 18, 2023

What is the primary function of a mobile phone? A little over fifteen years ago, the answer to this question would have been unanimous: “to call.”

Before the smartphone revolution changed consumer technology (and, by extension, the contemporary world as we know it) forever, cell phones were precise and self-explanatory in their purpose: a phone, but a phone you could carry anywhere. Like it or not, nobody could ask more of a Nokia 3310.

Today that period seems shrouded in the haze of our collective memory. What is a cell phone for today? The answer will vary greatly depending on who gives it. In many ways, though, a smartphone today is an enhanced version of our biological abilities: it expands our field of vision, improves our hearing, and amplifies our capacity to communicate. The modern mobile phone can be understood as an extension of the human being.

If the previous sentence seems exaggerated, it is probably because we do not live with a disability. We don’t need our phone’s high-end camera in order to see, nor its audio-recognition systems in order to hear; we can move around our house without issuing voice commands to activate this or that device. This is normal: able-bodied people don’t tend to imagine how disabled people experience the world. Which does not mean that, for them, a mobile phone cannot be an extension of themselves.

Few tech companies seem to have understood the above with as much interest as Apple. Its accessibility department has existed since 1985, but it did not reach the peak of its popularity (or of the company’s priorities) until the iPhone. It is there that Apple shapes its flagship consumer products so they can be adapted to people with disabilities. Behind the features that ship with its phones, there is always a team trying to adapt the technology to people who are blind or deaf.

It is a company policy, and one that has earned Apple a very good reputation among disability communities.

What is the motivation? Sarah Herrlinger, Apple’s Senior Director of Global Accessibility Policy and Initiatives, answers us. “Our first office for people with disabilities opened in 1985. To give some context, the Americans with Disabilities Act was passed in 1990. I think that’s a good way of showing that our commitment to this issue isn’t something a political regulation forced us into,” she explains proudly.

Herrlinger knows what she’s talking about: her duties include running Apple’s accessibility programs. “When we think about accessibility, we don’t think about building for individuals,” she elaborates. “It’s about building tools that serve all the alternative ways humans use technology.” That phrase (“alternative uses”) holds the key to accessibility: a blind person uses a smartphone too, however striking that may seem, just in a very different way from ours.

Literally “See” with an iPhone 14

To be fair, Apple can point to the iPhone as the ultimate proof of its commitment to people with disabilities. Every version of iOS has included some kind of dedicated accessibility functionality (in fact, the release of the first iPhone was met with great joy and surprise in the blind community), with improvements arriving in every update.

The settings screen of the iPhone 14, the latest version of the phone, includes a section devoted specifically to Accessibility. The tools vary by disability. VoiceOver is one of the most complete and also one of the most famous: when enabled, the phone reads the text on the screen aloud, so it is possible to understand what the phone is displaying without seeing it. It is highly customizable (from reading speed to tone or pronunciation) and has dozens of features.
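The mechanism a screen reader like VoiceOver relies on can be sketched in the abstract: interface elements expose text labels, and the reader walks the element tree in reading order, speaking each label it finds. A minimal illustrative sketch (the `Element` class, labels, and traversal below are invented for illustration; they are not Apple’s implementation):

```python
# Illustrative sketch of how a screen reader linearizes a UI into speech.
# The Element class and depth-first traversal are invented for this
# example; VoiceOver's actual implementation is far richer.

class Element:
    def __init__(self, label=None, children=()):
        self.label = label          # the text the screen reader would speak
        self.children = list(children)

def spoken_order(root):
    """Walk the element tree depth-first, collecting labels in reading order."""
    out = []
    def walk(el):
        if el.label:
            out.append(el.label)
        for child in el.children:
            walk(child)
    walk(root)
    return out

# A toy "screen": a title and a group with two items.
screen = Element(children=[
    Element(label="Settings"),
    Element(children=[
        Element(label="Accessibility"),
        Element(label="VoiceOver, on"),
    ]),
])

print(spoken_order(screen))
# -> ['Settings', 'Accessibility', 'VoiceOver, on']
```

Real screen readers add gestures, focus management, hints and much more, but the core idea of turning a visual interface into an ordered stream of speech is the same.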

Braille input is perhaps the most impressive. Daniella Rubio has been an Apple Distinguished Educator (a company representative who teaches other users how to use the iPhone’s advanced features) for ten years. She is also blind. Watching her hold the phone is hypnotic: VoiceOver narrates her position on the screen at all times as she toggles voice commands and the braille keyboard on and off.

There is almost nothing she cannot do with the phone.

“We can type braille on the screen,” she explains, her fingers moving at full speed. The keyboard works with six pressure points (three for each hand; Rubio holds the phone horizontally, as if gripping a game controller). “It calibrates to your fingers, because you may have very large hands, or small ones like a child’s,” she adds.
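Those six pressure points correspond to the standard six-dot braille cell: dots 1–3 under one hand, dots 4–6 under the other. How a set of pressed dots becomes a character can be sketched with Unicode’s Braille Patterns block, where each dot sets one bit above U+2800 (the dot-to-letter assignments are standard braille; the function itself is just an illustration, not Apple’s keyboard code):

```python
# Each braille dot maps to one bit above U+2800 in Unicode's
# "Braille Patterns" block: dot 1 -> 0x01, dot 2 -> 0x02, dot 3 -> 0x04,
# dot 4 -> 0x08, dot 5 -> 0x10, dot 6 -> 0x20.

DOT_BITS = {1: 0x01, 2: 0x02, 3: 0x04, 4: 0x08, 5: 0x10, 6: 0x20}

def cell_to_char(dots):
    """Turn a set of pressed dots (1-6) into the Unicode braille character."""
    code = 0x2800
    for d in dots:
        code |= DOT_BITS[d]
    return chr(code)

# Standard braille: 'a' is dot 1, 'b' is dots 1+2, 'c' is dots 1+4.
print(cell_to_char({1}))      # -> ⠁ (braille 'a')
print(cell_to_char({1, 2}))   # -> ⠃ (braille 'b')
print(cell_to_char({1, 4}))   # -> ⠉ (braille 'c')
```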

Rubio believes the iPhone’s possibilities are enormous, but that the blind community often fails to take advantage of them. For example, many blind people tend to use the smaller (mini) versions of the iPhone: why would they need a large screen if they can’t see it? For Rubio this is the wrong approach: a smaller or older phone is also a phone with a worse camera (on which the image-recognition system depends) and fewer features.

(Image: iPhone 14 accessibility. Dimitri Karastelev/Unsplash)

The braille keyboard is another good example. “It’s great, because Apple includes different kinds of commands that allow it to be used as a regular keyboard,” she continues, “but it’s a feature the blind community sometimes overlooks, because we have dictation.” And dictation is easier than typing. For Rubio, though, the braille keyboard means writing at high speed with a reading-and-writing method that many people also use beyond their cell phones.

That Apple targets blind and deaf people is natural when it comes to accessibility, largely because their barriers are relatively universal. But what about the many other disabled people whose problems are rare or unusual? “Accessibility settings,” Herrlinger replies, “are among the most robust parts of all our devices, because they offer so many different ways to configure and customize them.” Vision and hearing, yes, but also motor or cognitive functions.

According to Herrlinger, this integration of the different available capabilities has a reason for being, and it dates back to the founding of the Accessibility office almost four decades ago: “I think what sets us apart is the way we think about accessibility. It’s not a compliance issue; it’s not about trying to tick a box or doing the minimum amount of work to satisfy a regulation.”

In this sense, the iPhone 14 acts as both a claim and a flag. Its accessibility features also cover physical and motor skills: AssistiveTouch allows the phone’s touch controls to be tailored to one’s needs, for example for someone with reduced motor function; Switch Control lets you use the iPhone by activating the items on the screen sequentially; and Face ID makes unlocking as simple as possible for someone without arms, for example.

Apple usually releases new features every year.

Chicken or egg?

Face ID is a good example of Apple’s relationship with accessibility features. The facial-recognition system has been one of the iPhone’s main attractions in recent years, above all for its excellent precision when unlocking the screen or authorizing instant payments. It is also one of the company’s main arguments for setting its phone apart from the competition.

But it is also something else: a tool of incalculable value for millions of people with disabilities. Did these and many other accessibility functions spring from the company’s business drive, or were the technological innovations conceived with accessibility in mind from the start? Herrlinger thinks for a few seconds. “I think it’s a mix. It doesn’t have to be one or the other,” she replies.


She argues that some of this ambiguity stems from the way Apple works with its disability communities. People who are blind, deaf or have limited mobility work directly in the Accessibility department. “Sometimes our own employees say: I’d really like the phone to do this. The people-detection feature, for example, which understands how close someone is, came from one of our employees who is a member of the blind community,” she adds.

Herrlinger brings up Magnifier, one of the iPhone’s most comprehensive accessibility tools. Its refinement accelerated during the pandemic, a period marked by social distancing, which, she says, underscores the tight feedback loop her department works with: “We quickly realized how valuable it is to know how close you are to another person. In that sense, feature development is community-driven.”

In other words, there is a synergy: Apple works on innovations, and Accessibility looks for ways to integrate them into a comprehensive program for people with disabilities. Magnifier is self-explanatory in this sense: its detection mode uses the LiDAR scanner to read the spatial environment around the phone and relays that information to the user. Thanks to it, a blind person can act as if they could see those around them; they can detect a door or a step; they can even choose a red dress instead of a green one.
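The principle behind that detection mode can be sketched very simply: the LiDAR scanner produces a depth map (a grid of distances in meters), and the software reports the nearest obstacle so it can be announced aloud. A deliberately simplified illustration (the grid values and alert threshold are invented; Apple’s actual pipeline combines LiDAR with machine learning and is far more sophisticated):

```python
# Illustrative only: a depth map modeled as a grid of distances in metres.
# Apple's People/Door Detection uses the LiDAR scanner plus machine
# learning; this sketch only shows the "nearest obstacle" idea.

def nearest_obstacle(depth_map, alert_within=2.0):
    """Return the smallest distance in the map and whether to raise an alert."""
    closest = min(min(row) for row in depth_map)
    return closest, closest <= alert_within

depth_map = [
    [4.2, 3.9, 4.0],
    [3.1, 1.4, 3.3],   # something (a person? a door?) 1.4 m away
    [4.0, 3.8, 4.1],
]

distance, alert = nearest_obstacle(depth_map)
print(f"Nearest obstacle: {distance} m, alert: {alert}")
# -> Nearest obstacle: 1.4 m, alert: True
```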

Packing super-complex gadgets into a phone is all very well, but it is only part of the story: they are useless if users don’t know how to operate them. This is where, for Apple, the role of people like Rubio becomes crucial: “At first, when my students pick up an iPhone for the first time, they’re scared because it’s flat, it has no buttons. But the learning curve is gentle, because the gestures are so intuitive.” And once customized, the settings can also be carried over to other Apple devices.

There is also a pedagogical task. Accessibility is no stranger to the great dilemma of consumer technology (the greater the complexity, the harder it is to reach the mass user). “At first [with smartphones and the iPhone] it was very shocking, because many people were not aware of the accessibility features. Most blind people today use iPhones or iOS devices. They are using the technology, but I think there are a lot of features they don’t exploit because they don’t know how to use them,” she laments.

Her job, she stresses, is to train them. Should they stick to simpler functions? “Yes, a lot of blind people have the iPhone SE because it’s simpler. But if I had the SE, I wouldn’t be able to detect other people when they come close,” she explains. She also points to the iPhone 14’s camera, which is far more complete and precise: the higher resolution, combined with VoiceOver, lets her photograph her surroundings and hear a description of what she has captured. She opens a photo of her children on her phone, and the device describes it in every detail. It is a fascinating exercise for the spontaneity and precision on display.

“With a better camera, I can see better. It’s like having a pair of little eyes,” she adds with a laugh. Meanwhile, VoiceOver keeps narrating the image, spatially positioning every element of the photo. Her children included.

Image: Apple

Source: Xataka
