Much is made of how easy it is for sighted people to pick up an iOS device and intuitively learn how to use it. For blind people who wish to achieve a similar degree of total control, there is somewhat more of a learning curve. First made available when the iPhone 3GS was released in June 2009, VoiceOver has matured into a robust combination of screen reader and interface. Simply put, VoiceOver lets blind people use iOS devices, either by touching the screen directly or via external devices. It reads the information displayed on the screen and allows blind people to control their devices completely and competently.
People who can see are able to simply tap a desired icon to activate it. That arrangement would be disastrous for blind people, who need to be able to explore the screen without accidentally causing things to happen. VoiceOver must, therefore, change how your device behaves so that you can feel around, find the correct letter, number, or icon, and then tap twice on it quickly to indicate your intentional and informed choice.
There are also special gestures that make controlling things easier. For sighted people, these gestures would be counterintuitive and harder to master than the more immediate effects which are possible when you can see everything at a glance before interacting with your device.
VoiceOver does what other screen readers, like NVDA for Windows computers, do. Its philosophy encourages you to explore the screen with your finger and thereby gain an understanding of where things are. If there's one thing to always bear in mind, it's that you should never presume that everything is read out automatically. Always take the time to fully explore the screen of an app to make certain you know about everything that is present, including available options. It is possible to reach everything by flicking left and right between elements; they are moved through in left-to-right, top-to-bottom order, provided they aren't obscured or hidden. However, that can prove a slow and tedious way to navigate a busy screen full of information and controls. Learning where things are lets you touch an area on or near the element of interest and flick quickly to what you want. It also gives you, as a blind person, a true sense of how an app is laid out on the screen. Keep in mind that some elements, such as buttons or options, only appear while something else is in focus or after something has happened.
VoiceOver doesn’t always read changes out loud automatically, so I encourage people not to rely exclusively on flicking left and right. VoiceOver does a lot, including changing how your iOS device responds to touches and other gestures.
Using VoiceOver lets me read a web page, fill in a form on that page, and interact. Siri, on the other hand, would simply take me to the page and leave me stranded if sighted help or VoiceOver weren't available. I can surf the web with ease using several browsers, filling out forms and shopping online. I can write a document, such as this guide, using an award-winning writing app called Ulysses, thanks to its developers adding support for VoiceOver. In the Word of Promise Bible app, I can quickly navigate and read the Old and New Testaments with ease and efficiency. Because of VoiceOver, I can play a complex game like Six Ages: Ride Like the Wind, reading the results of my moves and selecting options with as much skill and confidence as any sighted player. Provided an app supports VoiceOver, I can make use of its controls and interface. This even includes highly visual apps such as the Camera app and the Maps GPS app. While much of VoiceOver's capability is in the user's control, a great deal of how well it works with various apps is actually in the hands of conscientious and disability-aware app developers. Apple provides many options they can use behind the scenes to affect how well VoiceOver and other accessibility tools work with their apps. Not being a programmer, I have only a very basic grasp of the effort required to make use of these options or exactly how they work. I don't have a sense of how much extra an app developer has to know about and do in order to create more delightful, intuitive experiences for VoiceOver users. My only insights come from using a great many apps over many years, plus the contact I've had with app developers when issues have come up. Therefore, this guide is mainly focused on elements of VoiceOver that are within the user's control.
However, it’s important for people to be aware that proactive developers who take the time to understand how VoiceOver works have a lot of latitude to optimize the experience of blind users. Should you find an app to be inaccessible, make certain developers are aware of the ability they have to broaden their potential audience by using accessibility tools including VoiceOver. You and they can find out a lot about how things work behind the scenes by looking at the developer resources found at www.apple.com/accessibility.
Typing on a touchscreen is a perfect example of both how deeply VoiceOver changes the iOS experience and how flexible it can be. For sighted people, simply tapping on a letter or number found on an app’s onscreen keyboard would result in that character being entered. Such a situation would be quite untenable for a blind person, especially while he or she was just learning where keys were. Therefore, VoiceOver provides three alternative typing modes or styles. Overall, the philosophy of VoiceOver is to offer more than one approach to things so that people with varying abilities can ultimately accomplish tasks. There may be methods that are faster for most people, but you shouldn’t feel that there is a right or wrong method. Find what works best for you among the options provided. It’s well worth your time to experiment and try different methods again once you’ve gained more general proficiency. Through this process of trying methods out, you’ll come up with your own best practices.
VoiceOver recognition capabilities
In iOS 14, Apple's accessibility department decided to take a big step to increase the chances that an app might be usable even when a developer's efforts in that direction came up short. Building on the image description work previously incorporated, VoiceOver now uses the artificial intelligence on board more modern devices to recognize text, better describe images, and even recognize other onscreen elements such as controls. Images are now described in more natural full sentences. Any text found in images or on control elements can be read via optical character recognition.
Furthermore, VoiceOver uses artificial intelligence to attempt to recognize controls and other onscreen elements of apps which haven't been properly labelled by their developers for best accessibility results. This can make previously inaccessible apps usable for blind people. However, it also increases VoiceOver's output, as it describes and notifies its users of things they were previously unaware of.
iOS 15 gave us further refinements of this more active use of artificial intelligence. Text and image recognition are now at a point where they're extremely useful and responsive. I've rarely found them to be an annoyance, and they are often quite useful, especially on websites or social media, where you're likely to run into pictures and images. However, screen recognition is another matter. I recommend keeping it turned off, since it frequently makes otherwise accessible apps impossible to use. Every once in a while, I've come across a situation where it actually proves helpful, recognizing a control or button in an app which I otherwise can't find. However, if a developer has done a lot of work making an app pleasantly accessible with VoiceOver, screen recognition doesn't seem to grasp this and undoes that work.
I feel compelled to point out the following to lazy developers: this artificial intelligence based element recognition system doesn't mean you can skip becoming familiar with the accessibility tools Apple has provided. To make apps more accessible and delightful for people who are blind, the developers who know their apps best must put in the effort. Artificial intelligence doesn't do it all for you. There's no assurance that these features will make an app truly usable, let alone enjoyable to use. The experience will always be better if developers do the work to make it that way. I may very well land on an element and learn that it's an image of a hamburger. However, if that image isn't labelled as a button or with the word "menu", I won't know to double tap on it or that it's the way to access a menu of options. Moreover, if I have to sit through long blasts of information in order to understand what's happening, I will most assuredly be looking for an alternative app, one whose developer paid better attention to how enjoyable and practical my experience as a blind user was.
It’s quite possible for an app to be technically accessible but utterly impractical and painful to use for blind people. The degree to which you will be able to make use of this powerful new capability hinges on how modern your device is. Any device older than an iPHONE 7 will not have the computing power to take advantage of this. It requires the A10 Fusion chip at the oldest. Newer processors will affect how well these features end up working for you. People with older devices may well find that it introduces enough of a delay in responsiveness or other issues that it makes more sense to disable VoiceOver recognition capabilities entirely. I often find that to be preferable when I’m using apps which developers have put serious effort into VoiceOver support. This can be done in VoiceOver settings. Alternatively, you can make use of options on your rotor to turn VoiceOver recognition features on or off in various apps as desired. VoiceOver will keep track of your current choices in this regard, so you won’t have to keep taking the time to indicate your recognition preferences as you switch between apps.
Settings can also be placed in the new VoiceOver quick settings introduced in iOS 15, which are accessed via a two finger quadruple tap. The features themselves are automatic for the most part. As you come to images, they will be described if you pause on the image in question long enough. The same applies to any text which might be inside images; it will be read aloud after a pause. The pause isn't a long one: around a second and a half is plenty of time to wait. Alternatively, you can set the recognition feedback to indicate the presence of information with a short sound. People should understand that unleashing artificial intelligence in this way is no trivial exercise. This is a big step forward for VoiceOver, possibly the most ambitious capability added to date. I expect it will take a few years to reach a state of polished maturity. Feedback from users will be absolutely crucial in balancing the amount of information offered against the need for productive speed in apps which require VoiceOver recognition to be usable. Improvements in this area will be subtle for the most part; they won't often result in new options. Instead, people who pay attention to image descriptions, for instance, may notice the quality of those descriptions improving over time. A lot of work has been done on making the artificial intelligence use full sentences to construct descriptions rather than phrases.
Recognition of controls and elements in less accessible apps is another key area where the average user may not be aware of improvements as they occur and are released in subsequent versions of iOS.
General help and hints
VoiceOver gives blind users tremendous control and capability. However, it is far from a perfect system. There is a substantial learning curve for new users, with little help provided in a way beginners can easily discover. While there are good instructions on how to use VoiceOver in the iOS User Guide, you need a basic grasp of VoiceOver before you'll be able to access them using your iOS device.
Apple does provide accessible Braille and tagged HTML copies of the user guides on their website at www.apple.com/accessibility. However, these aren’t always kept fully up to date. Also, nothing points people in that direction when they purchase these devices.
While there is a tutorial for owners of Apple Macintosh computers that takes beginners to a level of basic competence with VoiceOver, there is no similar facility for iOS. This would, I believe, help new users tremendously, and I have advocated more than once for it to be created. Without sight, the initial learning curve is simply too steep for too many people. That initial difficulty is deceptive, though: once the basics are understood, things start to feel intuitive and natural. Apple could and should do a lot to help beginners get up to speed more easily. They could create a tutorial far better than what I can put in this guide. Moreover, they could make it easier to find out about accessible apps in the App Store.
To help beginners learn how to do things, VoiceOver does offer a Practice mode that lets you safely practice gestures, or key commands if you're using a Braille display or Bluetooth keyboard. You can enter and leave this mode by tapping the screen twice quickly with four fingers at once. While in this mode, VoiceOver responds to any touch or gesture by briefly explaining what it would accomplish outside of Practice mode.
Another extremely useful facility included with VoiceOver is Hints. These are enabled by default and give new users small doses of instruction as they navigate the screen. If, for some reason, you aren't hearing helpful sentences like "slide up and down with one finger to adjust the value" or "double tap to open", then you have somehow managed to disable Hints. Here is how to reach that setting and enable them:
How to enable VoiceOver hints
- Presuming you’re finished setting up and your device is connected to cellular data or Wi-Fi, you can simply hold in the Home or Side button until a beep is heard and say “show VoiceOver settings”. Siri will then take you there, provided that Siri understood your words.
- If Siri isn’t available to you, find the Settings app on your home screen and double tap that.
- Flick right through the settings until you come to Accessibility and double tap that.
- Next, flick right until you find VoiceOver and double tap that. This works when VoiceOver is activated. If you’re instructing a sighted person how to do this, they simply need to tap once on the icons mentioned above.
- This area allows you to configure how VoiceOver behaves; we'll come back here often while learning about VoiceOver. Flick right until you hear Verbosity and double tap this. Then, flick right until you hear Hints. VoiceOver will indicate whether they are enabled or disabled. Double tap the Hints option to toggle it on or off.
- To exit this section of Settings, you can hit the Back button to return to the VoiceOver settings, or else hit the Home button or use the equivalent gesture to exit Settings altogether.
There are many settings you can adjust that change how VoiceOver behaves and what it speaks. They are all found in the VoiceOver section of the Settings app, with one important exception: the Accessibility Shortcut. Before we go any further, we will set that shortcut to VoiceOver. This will let you turn VoiceOver on or off by pressing the Home or Side button three times quickly.
This is useful in countless situations, such as when VoiceOver stops speaking due to a bug in the system, or when using self-voicing apps that don't need VoiceOver running to be used, as many games are. Here's another use case: you may want to turn off VoiceOver quickly to let a sighted friend make use of your iPhone. To set this shortcut:
How to set the Accessibility Shortcut to VoiceOver
- Find and double tap the Settings app.
- Flick right until you come to Accessibility and double tap this.
- Flick right through all the settings until you come to Accessibility Shortcut, the very last setting, and double tap this.
- Flick right until you come to VoiceOver and double tap this.
- Tap the Home button to get out of Settings, or use the Back button at the top left of the screen to go back to the previous branch of the settings tree.
Getting to VoiceOver settings can be done in a few ways. Presuming you’re all set up and connected to the Internet, the quickest way to get there is:
- Hold the Home or Side button down until you hear a beep.
- Say “show VoiceOver settings”.
- You should be placed in the correct section within the Settings app, regardless of where you were before you invoked Siri.
- Otherwise, from your home screen, double tap the Settings app.
- Flick right until you hear Accessibility and double tap this.
- Flick right until you hear VoiceOver, announced along with whether it is on or off, and double tap this.
There are many settings that change how VoiceOver behaves to suit your liking. This includes everything from how rapidly the voice speaks, to whether punctuation is announced, to which style of Braille is used if you connect a Braille display to your device, etc. We will discuss all of these in a later part of this guide.
The three typing modes and typing feedback
Right from the beginning as you set up your device, understanding how to type using the onscreen keyboard is absolutely essential. Doing this efficiently can be quite a challenge, especially for blind beginners who are used to typing on physical keyboards. In this subsection, we will examine the three approaches offered by the VoiceOver interface. We will also cover the options for receiving feedback while typing.
My advice is to take it slow and careful while getting the hang of things. I kept wanting to fly along at the same sort of speed I can manage on a physical keyboard. This was never a reasonable expectation. Even after years of constant practice, I’m nowhere near as fast. I use a physical keyboard for any lengthy writing I do on my iPhone. However, within a month of getting my first iPhone, I was able to reach a point where I could answer text messages and do other short writing at a speed that didn’t leave people waiting for ages. I hope these explanations help you figure out which one of the three approaches will best suit your needs.
VoiceOver offers three different typing modes or styles. Apple has never settled on either of these terms, and uses them interchangeably in the documentation and in iOS itself. When you find them in the rotor, they’re referred to as “Typing Mode”. In VoiceOver settings, they are referred to as “Typing Style”. These styles are called Standard, Touch, and Direct Touch. Let’s discuss each of these in turn:
Standard typing
The style selected by default is the most beginner-friendly. Using this method, you are free to take all the time you need to feel around the keyboard until you find the character you wish to enter. Alternatively, you can flick left or right to move through the keys in left-to-right, top-to-bottom sequence. Once you have found the character you want, you can lift your finger and then tap twice quickly to enter that character. If you find this requires too much coordination, you can perform what is known as a "split tap": keep your finger held down on the character you want and use another finger to tap once anywhere on the screen. This types the character, and it is spoken so you know it was typed. For example, I might want to enter the letter "a". Feeling around the left side of the keyboard, I find the "a", hearing it spoken. At that point, I lift my finger from the keyboard but keep its position above the "a" character. I then tap twice quickly to enter the "a". Alternatively, I could use a split tap by keeping my finger on the "a" and using another finger, perhaps on my free hand, to tap anywhere else on the screen. I would then hear the "a" spoken as it was entered. This style offers the maximum level of forgiveness while people get used to typing on a keyboard modelled after the QWERTY layout but on a flat surface. You can take all the time you need to feel where keys are without worrying about accidentally entering anything. Nothing happens unless you use a double tap or split tap to firmly indicate your intentions. This includes the Delete and Return buttons, and the More Numbers, More Letters, and More Symbols buttons which give access to those different sets of characters.
Touch typing
While the Standard typing style is very forgiving, many people eventually find it cumbersome. The Touch typing style offers the potential for much greater speed in exchange for some of that forgiveness. Once you have developed enough of a mental map to know roughly where characters are on the screen, it normally doesn't take much time to find the right one. Using the Touch typing style, you put your finger on the keyboard and slide it around to find the correct key, keeping your finger on the screen. When you come to the right character, simply lift your finger, and it will be entered instantly. This is the style I like best; I moved to it within months of getting my first iPhone. Just don't leave your finger in one spot for more than a second or so. If you do, an alternative set of symbols will appear for you to select from. These symbols are ones that aren't used quite as often but are nonetheless handy, including symbols for different currencies, accents, and many more. Provided you keep moving your finger, this switch won't occur, and the worst that can happen is that you'll have to delete a symbol you didn't actually want. You gain speed by not having to be as accurate, since you can quickly slide your finger to the correct character, lift it to enter that character, and place it at or near the next character you need. I found this method far more intuitive and speedy. You don't need to double tap or split tap all the time and can just zip along.
Direct touch typing
This style is for people with superb muscle memory and hand coordination. Essentially, VoiceOver gets out of the way, allowing you to simply touch characters and have them entered instantly. This is precisely how sighted people type on their devices: they simply touch the desired keys. In effect, this typing style offers maximum speed for people with the coordination and precision required to make use of it. However, as it is for people with sight, the tradeoff is absolutely no forgiveness for errant placement of one's fingertip. Those who lack the needed hand coordination, steadiness, and precise positional memory and awareness will find Direct Touch typing an exercise in utter frustration, constantly deleting wrongly entered characters.
Obtaining feedback while typing
Most screen readers refer to this as "typing echo". The basic idea is that you can choose what kind of feedback you receive as characters are entered. There are four choices. The first is "none": no feedback at all. This is what I prefer when using a reliable physical keyboard. Characters won't be spoken when entered, although if you're using the onscreen keyboard, you will still hear the characters your finger moves over. The next choice is "characters". This results in hearing characters when you find them onscreen, as well as hearing the same character spoken again when it is entered. This gives you confidence that you know precisely what ends up being entered rather than merely felt, so you know your finger didn't slip to a wrong character. The next choice is "words". Rather than reading each character entered, you hear words as they are completed by a space or punctuation mark. This gives you the confidence of hearing words as they're completed, and can be less annoying than hearing each character spoken twice. The final choice is "characters and words". It speaks each character entered plus the words you complete, giving you maximum feedback. This can be useful in situations where the noise level or other distraction is high. You can select your preferred typing feedback in the VoiceOver settings; simply double tap the choice you want, and it will be marked as selected. This is also where you will find the Typing Style setting. While the rotor may be used to switch between typing styles, you cannot use it to select your preferred typing feedback. For that, you will need to go into the full VoiceOver settings or use the quick settings introduced in iOS 15. I find that while using the onscreen keyboard, the "characters" feedback choice works best for me. Should my finger slide while being lifted from the screen, I'll know right away that a wrong character was entered.
Meanwhile, if I’m typing on a physical keyboard that I’m familiar with and can fully trust, I tend to prefer “no feedback”. This lets me write the most efficiently.
The rotor
One of the first things I heard about when people began telling me about VoiceOver was the rotor. I'm old enough to remember the rotary dials on telephones I used as a child, which gave me an instant grasp of how this idea was supposed to work. A lot of things used to have knobs you would turn to control them. In this era of flat touchscreens and buttons, people aren't as familiar with turning dials and knobs, especially younger people. I suppose it's perfectly possible to be in one's twenties and never have had the experience of turning a dial or knob. For me, this concept was an absolutely delightful no-brainer, and I took to it immediately. Your iOS experience will be infinitely better if you take the time to learn the VoiceOver rotor. Apple has used it to make navigation, and many other things, easier and more efficient for blind people. Imagine a knob sticking out from your screen. It can be anywhere on the screen that is most comfortable for you, and its size is also whatever is comfortable, provided there's at least a little space between the two fingertips you'll use to turn it. Place the tips of two fingers on the screen as if you were gently pinching a small marble between them, then turn that imaginary knob left or right. This rotates through a circular menu of options that determine what flicking upward or downward with one finger will do. I got used to this within an hour of getting my iPhone, although some of the implications took a bit longer to feel natural. The easiest place to start is adjusting slider controls for things you can immediately hear, like speaking rate or volume. Turn the rotor until you hear what you want to adjust; once that's done, flicking up or down will adjust the value. For instance, if you turn the rotor to Speaking Rate and flick up with one finger, the rate of speech will get faster.
Once you’re happy with the speaking rate or other value you’ve adjusted, turn the rotor to a safe setting like Headings, Characters, or Words. This way, you won’t suddenly be surprised when the speed changes or the volume goes down if you happen to flick up or down accidentally. Remember that the rotor is always pointed at something. If you can’t remember what that is, simply turn it left or right, and it will announce what it points at. The rotor is absolutely never pointing at nothing. While the rotor is circular and will wrap around back to the first option if you keep turning it in the same direction, that isn’t always the case for flicking up or down. If your rotor is set to Speaking Rate and you flick down to 0%, flicking down again won’t shoot up to the top value. On the other hand, if you set the rotor to Language and flick up or down, it will wrap so that you’ll keep cycling through your language choices and don’t have to flick back through them. Something that really threw me for a loop at first was moving through documents. I was used to doing that with arrow keys, where you could move left or right when proceeding by character or word, and up or down to move by lines. Using an iOS device without a physical keyboard, you’d turn the rotor to Characters or Words and then move up or down to go backward or forward along the current line, respectively. Instead of holding down a key to move by words, the trick is to turn the rotor to the amount you want to move by, and then flick up or down once for each unit you intend to travel. That holds true for Lines and Headings, so it’s actually more consistent. Meanwhile, if you flick left or right while in an edit field, you move out of that field to the closest element in the rest of the app in sequence top to bottom and left to right. An edit field is a single element containing your document. Other elements may be options such as an Edit or Settings button or Document Export button. 
Some apps have a Word Count or other status information that may be treated as one or more display elements within an app. The rotor, therefore, helps you navigate within your document or other single elements rather than moving to other parts of an app. There are a few different kinds of options that can be on the rotor. Before we go any further, let's go over the various types, so you have a sense of what you're working with as you explore.
Navigational rotor options
Navigational options are on the rotor to help you move quickly through whatever information or controls you might need as you use your device. For instance, turning the rotor to the Words setting lets you move forward or backward by one word at a time. The Headings setting lets you jump forward or backward through a document or web page by heading, which can be a very efficient way of moving to areas of interest. A particularly noteworthy setting in this category is Vertical Navigation, which lets you move directly up or down via upward or downward flicks. This can make reading maps and tables, and even examining game boards, easier. A good many of these options are particularly useful for browsing websites. Depending on how much control you want, removing some of these options can declutter your rotor. For instance, I like moving to the next or prior Link but don't need to move to the next Non-Visited Link. Similarly, I've removed the option to move to the next Text Field from my rotor, but I appreciate the one that moves to the next or prior Form Control. To make these movements, you need only flick up or down with one finger once you've turned the rotor to the desired setting.
Sliders and switches
Sliders and switches are similar rotor options. Simply turn to what you want to adjust and flick up or down. If you're dealing with a slider, the value will increase if you flick upward or decrease if you flick downward. For example, turning the rotor to Speaking Rate and flicking downward will slow the rate of speech, while flicking upward will increase it. Sliders like Volume and Speaking Rate don't wrap around. When you try to move past 0% for Volume, it won't suddenly fly up to 100% and yell at you; to gain volume back, you'll need to flick upward with one finger. Switches allow you to turn a feature on or off. Unlike physical switches, flicking repeatedly in the same direction will keep toggling the setting between on and off. I often do this with the Audio Ducking option, which determines whether other audio, such as music you're playing, automatically drops in volume while VoiceOver speaks. This is a very handy feature that makes certain what VoiceOver says will never be drowned out by music or an action movie, etc. When I'm working, I quite frequently turn on Audio Ducking: I simply turn to the Audio Ducking setting and flick either up or down. Either direction will do, since all you're doing is toggling between two selections, on or off. Other switches work the same way, such as those for VoiceOver Hints and Sounds. The Sounds setting refers to VoiceOver-related audio cues: short and relatively unobtrusive noises that indicate movement to new lines and other events without the need for spoken words that might break up the flow of information. I would advise beginners to leave both Audio Ducking and Sounds on while they learn how to use VoiceOver. Both are very useful features and should only be turned off when there's a good reason. Why, you might well wonder, would anyone want to turn off features like these?
Sometimes, features meant to be helpful can get in your way when you’re trying to do things as rapidly as possible. You might not want to always hear hints once you’re a competent VoiceOver user. Here’s another example: With Audio Ducking, there are many occasions, like when listening to a show, where you really don’t want some random notification causing VoiceOver to duck the show’s audio and make you miss what someone says. That’s when it’s good to be able to turn the rotor to Audio Ducking and flick up or down to turn the feature off. This can also be helpful if you’re chatting with one or more people and don’t want to miss out on what they’re saying due to the volume of the conversation dropping whenever VoiceOver speaks.
Some rotor settings are special ones. For example, the Braille Screen Input setting will activate that method of input whenever the rotor is set there. You need not double tap to execute this choice. Merely turn the rotor to that setting, and you’re off and running. The same goes for the Handwriting setting that lets you write characters on the screen rather than typing them. To dismiss these input methods, you need only turn the rotor to another setting. If you turn the rotor to Actions, you uncover a menu of all actions you can take given the current element in focus. Flicking up or down moves through that context-sensitive menu, and double tapping on a choice executes it. If you move over an element where actions are available, the rotor will automatically turn to Actions. One example of this is when you delete a voice from your iOS device. That takes advantage of the Actions setting. Another place where you’ll find Actions is when you move to a song in the Music app. Flicking up or down will cycle through available Actions. This can sometimes be annoying when you’re moving over the editing field of a word-processing app. If you forget to double tap on that field, thereby entering editing mode, you will find a series of Actions rather than the words you might have expected when you flick up or down to navigate the document. It’s always a good idea to check different parts of an app if you hear VoiceOver say “actions available”. This is one way that VoiceOver lets blind people access things that might be visible on toolbars in an app’s display. Another is the Edit rotor setting. This makes it possible to Cut, Copy, and Paste text, as well as perform other tasks associated with editing. Similar to when using the Actions menu, you need to double tap on the action you want to choose. In iOS 11, it became possible for developers to add custom rotor options that work within their apps. These could be used to speak information or make options more practical to use.
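For readers curious about the developer side, the entries in an Actions menu are supplied by the app itself through Apple’s UIKit accessibility API. Here is a minimal Swift sketch; the cell class and action names are hypothetical, not taken from any real app.

```swift
import UIKit

// Hypothetical message cell in a mail-style app; the class and
// action names are illustrative, not from any real app.
class MessageCell: UITableViewCell {
    func configureAccessibility() {
        // Each custom action becomes an entry in the Actions rotor
        // when VoiceOver focuses this cell.
        let markRead = UIAccessibilityCustomAction(name: "Mark as Read") { _ in
            // Do the work here; return true to report success.
            true
        }
        let trash = UIAccessibilityCustomAction(name: "Delete") { _ in
            true
        }
        accessibilityCustomActions = [markRead, trash]
    }
}
```

When VoiceOver focuses an element with custom actions assigned, it announces “actions available”, and flicking up or down cycles through the names given here.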
As an example of a custom rotor option, consider the BlindSquare GPS navigation app. While using it, you might find yourself missing a spoken message because you needed to pay attention to your surroundings. Being able to go over the last few spoken messages quickly could prove handy in keeping your bearings while travelling. Therefore, the developers of BlindSquare have added a Speech History rotor setting. Turning to this setting, you simply have to flick upward to hear prior spoken messages. If developers take the time to include such features, you will be able to find out about them in the help or other documentation provided for that specific app. As this capability was only recently introduced, awareness of it is still spreading among app developers. I look forward to experiencing the many ingenious uses developers come up with in the years ahead as more of them learn about the possibility of adding custom rotor options to their apps. There’s a great deal of untapped potential in this capability.
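For developers, a rotor option like this is built with UIKit’s custom rotor API. The sketch below shows a simplified “Speech History” style rotor that steps backward and forward through a list of labels holding past messages. The class, property names, and overall approach are hypothetical; this is not BlindSquare’s actual code.

```swift
import UIKit

// Hypothetical screen that keeps one UILabel per past spoken message.
class HistoryViewController: UIViewController {
    var messageLabels: [UILabel] = []

    func installRotor() {
        // The closure is called each time the user flicks up or down
        // while the rotor is turned to this custom setting.
        let rotor = UIAccessibilityCustomRotor(name: "Speech History") { predicate in
            guard let current = predicate.currentItem.targetElement as? UILabel,
                  let index = self.messageLabels.firstIndex(of: current) else {
                // No current position yet: start at the most recent message.
                guard let newest = self.messageLabels.last else { return nil }
                return UIAccessibilityCustomRotorItemResult(targetElement: newest,
                                                            targetRange: nil)
            }
            // A downward flick (.next) moves to an older message here;
            // an upward flick (.previous) moves to a newer one.
            let next = predicate.searchDirection == .next ? index - 1 : index + 1
            guard self.messageLabels.indices.contains(next) else { return nil }
            return UIAccessibilityCustomRotorItemResult(targetElement: self.messageLabels[next],
                                                        targetRange: nil)
        }
        accessibilityCustomRotors = [rotor]
    }
}
```

Returning nil from the search closure tells VoiceOver there is nothing further in that direction, which is why such rotors stop rather than wrapping around.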
Changing What’s on the Rotor
A lot of people greatly enjoy the rotor idea. I’m one of them, and my rotor has nearly every possible option on it. However, having too much on the rotor can be overwhelming for beginners. Thankfully, it is possible to change which tools are present in the rotor and decide the order that they’re positioned in. This lets you find an order and set of features that best suits your style, needs, and competence. To customize the rotor, go into VoiceOver settings and then flick right until you come to Rotor. Double tap that button, and you’ll be in the right place. Once there, you’ll find a long list of settings with buttons that can be either selected or not. Selected items will appear on the rotor. All others won’t ever appear. To add or remove items from the rotor, you simply double tap a setting to select or deselect it. That lets you easily remove unwanted items from the rotor, reducing clutter. However, that still may mean many options to turn through to reach the one you want at a given moment. It certainly does in my case. Thankfully, it is possible to change the order of options on the rotor. One flick to the right of each setting is a Reorder button. To use these buttons, find the one to the right of the option you want to move. Next, double tap and hold your finger on the screen after the second tap. You can then drag the button up or down to change the placement of the item on the rotor. As you drag your finger, you will be told what you’ve moved above or below, accompanied by a small auditory beep and tactile vibration. Just take your time and move your finger slowly while you get the hang of this.
How to Do It: Selecting Braille Screen Input
Braille Screen Input, or BSI for short, lets you position your fingers on the screen as if they were on the keys of a Braille writer and enter Braille characters. This can be easier to master for people proficient in Braille than the standard onscreen keyboard. What’s more, if this option is moved to the very top of the rotor, you will find that BSI will activate automatically whenever an edit field is entered. This can save you many twists of the rotor if you want to use it whenever possible. To demonstrate how to select and change the order of items in the rotor, we will go through the steps of setting this up. If you don’t want to actually use BSI, simply double tap the Braille Screen Input option until it doesn’t say “selected”. Presuming you’re already in the Rotor settings:
- Flick to the right repeatedly until you come to Braille Screen Input.
- Double tap on this button to enable this feature.
- Flick once to the right, placing you on the Reorder button matching the Braille Screen Input feature.
- Double tap quickly with one finger lower down on the screen. After the second tap, hold your finger down. Think of this like picking up a puzzle piece. As long as your finger remains on the screen, the choice will be attached to it as if your fingertip were magnetic.
- Slowly slide your finger up towards the top of the screen. You will hear a small click sound and feel a short vibration each time you move your finger far enough to pass an option. You will also be told what you move above as your finger slides up the screen.
- If you get to the top of the screen, lift your finger. You can then find the Reorder button for the Braille Screen Input option again and move it further up if necessary.
- Make certain that the Braille Screen Input option is at the very top of the list of rotor settings. There’s a heading at the very top of the list, so turning the rotor to Headings and flicking up will get you to the top quickly.
- Flick right over the options. The very first one should now be Braille Screen Input. This completes the process. From now on, the Braille keyboard will activate whenever you need to type something. If you prefer not to make use of the Braille keyboard during a particular occasion, you need only turn the rotor to another option, such as Characters or Words. The traditional keyboard will then be available.
Now, you should be able to move options around at will, as well as determine which of them will be on your rotor.
In iOS 15, Apple introduced an addition called quick settings which, I believe, will be helpful to all but especially to beginners who have yet to master the rotor. Think of it as a handy pocket where you can have settings for VoiceOver available anywhere you happen to be. This removes the need to lose your place in an app or home screen and go into VoiceOver settings in order to make changes to frequently altered settings. Selecting and arranging the settings you want to have in that handy pocket is done in the same way as for rotor settings. Simply go into VoiceOver settings, and then into “Quick settings”. Once there, you can double tap settings to include or exclude them from the settings found in the quick settings pocket. Any settings which say “selected” before their names will appear. Any that don’t won’t trouble you while in the quick settings pocket. There is no limit on how many settings can be in the pocket. Similar to rotor options, the order of settings can be rearranged. When on a setting, flick up or down with one finger and you’ll come to options to move the setting you’re on up or down towards the start or the end of the collection. It’s that simple. None of the navigation settings are available in the quick settings choices. You can’t use it to change the rotor to characters or words. However, things like language, speaking rate, punctuation, typing feedback and typing mode are available. Beginners may find it useful to have those kinds of settings available in the quick settings and remove them from the rotor. This would make it less likely that you would accidentally turn off hints or accidentally reduce speaking rate or volume. The rotor would be less cluttered with choices unless and until you reached a point where you wanted to have them on the rotor again. Absolutely nothing stops you from having settings available both on the rotor and in quick settings at the same time.
The only consideration is how many turns or flicks you want to execute to reach the setting you want in each collection. Once you’ve added settings to the quick settings pocket, you get to it with a two finger quadruple tap. Using two fingers at the same time, strike the touchscreen four times in quick succession. You will then find yourself in an area containing the settings you selected. There’s a search field and a filter to restrict the selection even further. Otherwise, simply use a finger to flick right through the settings or left to go back over them towards the top. The settings are fully active, so you can flick up or down on them to make adjustments as if you had selected them in the rotor.
Voices of Choice
Now that we’ve gotten you to this point, it’s time to let you in on something. You’re not stuck with the default voice that VoiceOver has used to speak to you up to this point. Apple makes a small but growing collection of voices available, and you don’t need to pay for them. However, they do take up data storage space on your iOS device. Make certain you’re connected to Wi-Fi before downloading these voices. Even the smallest of them are over 100 megabytes (MB) in size. Some of the better voices are more than half a gigabyte. Alex, a high-quality voice made by Apple, is the largest voice available so far. It takes up over 800 MB. However, Alex is unique in that he simulates breathing as he reads. Many people, myself included, think he’s well worth his bulk. Others, including my wife Sara, think Alex is a bit creepy and prefer other voices. Thankfully, there are at least a couple of choices for each supported language. At a bare minimum, there will be a male and female voice. To download voices and add languages to your rotor, go into VoiceOver settings and then into Speech. Once there, flick right, and you’ll hear what the default voice is. You will also find numerous settings including Rate and Volume that you can adjust. Notice that the voice name is a button. Double tap that button, and you can change to a different default voice. The change will only take effect after the chosen voice finishes downloading. That’s also the case if you change the active voice for a language on the Language Rotor. It’s possible to have more voices on your iOS device if you wish. For instance, you can have both a male and a female voice for a given language. However, only one can occupy the slot on the Language Rotor and be ready to be quickly brought into service.
How to Do It: Adding a New Language to the Language Rotor and Choosing a Voice:
Presuming you’re in the VoiceOver Speech Settings:
- Flick right until you reach the Add New Language button.
- Double tap this button with one finger. This takes you into a list of languages.
- Flick right through the languages until you hear one that you’re familiar with and want to try. Double tap it, and VoiceOver should say “selected” whenever you flick over that particular language on the list. Note that a language may have one or more dialects. Each of these counts as a separate language and can have a slot on the Language Rotor. Thus, you could have a default voice plus US, UK, South African, and Irish English voices.
- Once you’ve selected one or more languages, flick left with one finger or touch near the top left of the screen to reach the Back button. Double tap this with one finger.
- At this point, flicking right over the available options, you will find buttons for each language you have added. Double tap one of these buttons with a finger.
- Now you’re in the list of available voices for that language. Flick right over the choices and double tap the Download button of the one you’d like to try with one finger. The voice will then be downloaded.
- Flick left and double tap on the name of the voice with one finger after it has downloaded. That voice will now say “selected” and will be the active voice in that language slot.
- To try this new voice, turn the rotor to the Language setting.
- Flick up or down with one finger through the available options. There will be at least two, including the default slot and the language you’ve added. Continuing to flick in either direction will wrap through the available options endlessly. You simply need to turn the rotor to something else once you are happy with the language selected.
Should space become tight, it is possible to delete extra voices. You simply go into their language in the VoiceOver Speech settings and flick down while on a voice’s name. You will eventually reach the Delete option and can then double tap with one finger to remove the voice from your device. Whenever a new version of iOS comes out, it’s a good idea to check in the Speech area for possible new voices in your language or dialect of choice. In iOS 12, there are new South African and Irish English voices. iOS 13 saw the addition of Indian voices. These voices are often added for use with Siri, but can also be used by VoiceOver.
Thanks to the rotor, selecting, copying, cutting and pasting text are all done easily. To use your rotor for this purpose, the Text Selection setting must be available on your rotor. It may not be there by default, so you may have to add it. We went over adding, ordering, and removing items from your rotor earlier. Once you turn to the Text Selection setting, flicking up or down adjusts the unit of movement that the cursor will travel and select. To select text, flick right to expand the selected content by one unit of movement. Flick left to shrink the selected area by a unit of movement. You can change the unit of movement to fine-tune what is selected. For instance, you could select a few lines of text and then flick up to set the movement unit to Words so that you can then flick left to remove one or more words you didn’t want selected.
Let’s use a sentence as an example. “The quick brown fox jumped over the lazy dog.” To remove “The quick brown fox”, follow these steps:
- Turn the rotor to Characters or Words, and move to the start of the word “The”.
- Next, turn the rotor to Text Selection.
- Flick up or down to cycle through the units of movement until you come to Words.
- Flick right a few times, and you will hear “The quick brown fox” marked as selected.
- Find the Delete key near the bottom right of the virtual keyboard, or hit the Delete key on your physical keyboard if you’re using one. This will delete the selected text. Sticking with our example sentence, let’s tackle cutting and pasting text. We’ll pivot the two ends of the sentence around “jumped over”. When we’re done, “the lazy dog” will jump over “the quick brown fox”. Follow these steps:
- Select “The quick brown fox” precisely as we did above.
- Next, turn the rotor to the Edit setting. This gives you access to a menu of common editing actions.
- Flick up or down until you come to Cut, and double tap with one finger to perform the action. “The quick brown fox” is now removed from its customary home at the start of the sentence and is in your clipboard. This is a temporary storage place where things are put until they are copied or pasted elsewhere.
- Turn the rotor to Characters or Words, and flick down to get past “jumped over”. Hit the space key to leave a space after “over”.
- Turn the rotor until you reach Edit, and then flick down until you get to Paste. Double tap with one finger to paste “The quick brown fox” after “jumped over”.
- Next, turn the rotor to Characters or Words, and flick down over that “quick brown fox”. Get to the start of “the lazy dog”.
- Turn the rotor to Text Selection and set the movement unit to Words. Flick right three times to select “the lazy dog”.
- Turn the rotor to Edit and flick up or down until you reach Cut. Double tap this with one finger and “the lazy dog” will be hoisted into your clipboard.
- Turn the rotor to Characters or Words and move to before the words “jumped over”. Put a space before the “j” in “jumped” and flick up to move to before that space. This way, our last step will place “the lazy dog” perfectly in his new home at the start of the sentence.
- Turn the rotor to Edit and flick up or down to find Paste. Double tap this, and you’re all done. The spacing may not be perfect, but we’ve achieved our aim.
This works on whole blocks of text, allowing you to easily reorder chunks of the document you’re working on. It lets you quickly select precisely the text you want to share from an email. You could then use the Edit rotor setting to cut that text to the clipboard, go to the email you’re composing, and paste the text right into it. It also works that way for web pages and anywhere else you’re allowed to copy or edit text. You can’t use it to copy a commercial ebook and paste the entire thing into a document, thereby defeating digital copy protection.
Written out as a list of steps, the text selection process sounds far less intuitive and natural than it will actually feel after you’ve used text selection a few times. It’s all turns and flicks. Part of my sense of how easy text selection and movement actually is now comes from my having lived through how nearly impossible the process was prior to Apple devising this system. For me, it has literally made all the difference. I began my iPhone journey refusing to write anything on my iPhone longer than a few paragraphs. After text selection and manipulation became easy, thanks to the rotor, I wanted to use my iPhone for absolutely all of my writing. Once I found a good Bluetooth keyboard and the nifty word processor I’m now using, this became not only possible, but truly enjoyable. As long as you understand what the gestures do, you’ll marvel at how intuitive, precise, and quick the text selection process is.
There are many gestures that you will use to control your iOS device. These gestures will involve from one to four fingers. If you find it easier, you can perform these with fingers from each hand. Sensors in the surface of your device will detect the differences between finger movements. With more modern devices, beginning with iPhone 6S, this is taken even further with a concept called 3D Touch. Devices with this capability can sense how hard you press on a given app or option and act accordingly. Personally, I’ve found 3D Touch to be more trouble than it’s worth and don’t use it regularly. I’m apparently far from the only one to come to this conclusion. It appears that Apple plans to remove 3D Touch in future devices. If your device has this capability, you can enable and disable it in the Accessibility settings. Apple has made certain that 3D Touch will work in conjunction with VoiceOver, so you can certainly make use of it if you wish. Meanwhile, let us proceed with gestures specific to VoiceOver.
The touch is the most basic of gestures. It is accomplished by touching the surface of your device with a single finger. This announces the item you happen to place your finger upon. The item will be read aloud, but not activated. You can safely touch all over the screen to your heart’s content and develop a sense of where things are. This is true in all areas, including inside apps you may be using.
A tap is accomplished by quickly striking and then lifting one or more fingers from your device. There are taps using from one to four fingers, involving up to four strikes in quick succession. When using VoiceOver, there is no single-finger tap. This is regarded as a touch. A single tap with two fingers will cause VoiceOver to pause or resume speech. Very handy during book reading and on plenty of other occasions. A three-finger tap will read an item summary or report status information such as your position within a document. Finally, tapping with four fingers near the top or bottom of the screen will quickly move your cursor to the top or bottom, respectively.
The Double Tap:
This gesture comes in variations involving from one to four fingers. In all cases, the touchscreen is struck twice in quick succession. Presuming you haven’t changed the Double Tap Timeout setting, the default speed is roughly that of a stereotypical heartbeat or two quick knocks on a door. The single-finger double tap, simply called the double tap, is used to activate things. It lets VoiceOver know that you intend to activate an item, rather than simply find out what it is. Touching a button called Play would merely cause VoiceOver to inform you of the button’s function. Performing a double tap on the button would cause it to activate, which would perhaps cause music to play. Double tapping on a button called OK would cause it to be pressed. Double tapping on a selected app on your home screens would cause the app to launch.
The Two Finger Double Tap
This variant of the double tap is sometimes called the “magic” tap. Its results will vary depending on the context in which it is used. Normally, it will cause the most recent app to have played audio to pause or resume playback. This is useful when busy in another app. Rather than having to switch into the app playing audio, pause it, and then make your way back, you can quickly do a two-finger double tap while in the email you’re reading. This pauses the audio, making it easier to review whatever part of the email you didn’t quite catch without being distracted by the additional audio. Should you receive a phone call, the two-finger double tap can be used to answer it, and then hang up when you’re finished. It will also perform functions specified by app developers. For instance, the Twitter app uses this to give rapid access to tweet options such as reply and retweet, etc.
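On the developer side, an app opts into the magic tap by overriding a single method from UIKit’s accessibility API. A minimal sketch, using a hypothetical player view controller:

```swift
import UIKit

// Hypothetical media player screen. Handling the magic tap on a
// view controller (or anywhere up the responder chain) lets
// VoiceOver users trigger the app's most important action.
class PlayerViewController: UIViewController {
    var isPlaying = false

    override func accessibilityPerformMagicTap() -> Bool {
        // Toggle the app's primary activity and report success.
        isPlaying.toggle()
        return true   // returning false passes the gesture up the chain
    }
}
```

Returning false lets the system keep searching the responder chain, which is how the gesture still reaches system features such as answering a call when the current app does nothing special with it.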
The Two Finger Double Tap and Hold:
This gesture involves making a two-finger double tap and then holding your fingers down after the second tap. Wait until a short tone sounds, and then lift your fingers off the screen. This puts you in a mode that makes it possible to label buttons and other improperly labelled parts of an app you want to use. This can make otherwise inaccessible apps usable. If developers or sighted helpers inform blind people of what an element or button is supposed to be, this gesture lets you input those labels, making the app more accessible.
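The root cause this gesture works around is a control that shipped without a spoken name. For developers, the fix is a one-line assignment in UIKit; a brief sketch, where the button and icon names are illustrative:

```swift
import UIKit

// Sketch of the developer-side fix that makes the labelling gesture
// unnecessary: giving an image-only button a spoken name.
func makePlayButton() -> UIButton {
    let playButton = UIButton(type: .custom)
    playButton.setImage(UIImage(named: "triangle-icon"), for: .normal)
    // Without a label, VoiceOver may announce something unhelpful,
    // such as the image's file name; with one, it reads as "Play".
    playButton.accessibilityLabel = "Play"
    playButton.accessibilityTraits = .button
    return playButton
}
```

A label set this way reaches every user of the app, whereas a label applied with the two-finger double tap and hold is stored only on your own device.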
The Three-Finger Double Tap:
This gesture will turn speech on or off. This is not the same as turning off VoiceOver. You may want to do this if you are making use of a Braille display. Braille is all handled through the VoiceOver feature. Therefore, if you prefer not to be interrupted by speech chattering away as you try to read lecture notes or a book in Braille, you should use this gesture to turn off speech. All VoiceOver gestures will still behave normally. Also, the small sounds VoiceOver uses to alert you to various things will continue to function. Simply perform another three-finger double tap to cause VoiceOver to resume speaking. Turning VoiceOver off and then back on will automatically have it resume the default behaviour of speaking.
Entering Quick Settings:
To reach a collection of VoiceOver settings you may want to adjust frequently, use a two finger quadruple tap. Take two fingers and tap them four times together rapidly on the touchscreen. You can then adjust these settings before returning to where you were via the “Done” button or the home gesture.
Entering Help Mode:
VoiceOver contains a Practice mode that allows you to become accustomed to performing various gestures without causing unintended consequences. You can enter this mode by employing a four-finger double tap. With four fingers spaced apart in a manner allowing the tips of all to touch the screen at once, strike the screen twice rapidly. VoiceOver should announce “starting help” followed by instructions on how you can exit this mode. This is done by the exact same gesture used to enter it. As I explain various gestures, you can enter this mode and master them. Should you wish to control your device with a Braille display, Bluetooth keyboard, or some other method, this mode will be helpful in determining which buttons or keys on your hardware perform various functions. With a Bluetooth keyboard, use the VoiceOver key and the letter “k”. The Escape key exits Help. On a Braille display, hold down the spacebar and hit the letter “k” to enter and exit Help.
The Triple Tap:
Tapping quickly three times with one or more fingers is called a triple tap. The one-finger triple tap is used to access context menus. There are all sorts of context menus. Performing a triple tap on the Twitterrific app, I find options to compose a tweet, edit the home screen, and more. One option you will always find first in a context menu is the Dismiss Context Menu option. This returns you to whatever was present prior to you invoking the context menu. Try this out in various places where you suspect there may be available contextual options. One example of where this context menu access comes in handy is in the Ulysses app I used to write this guide. There are times when you want to search for all instances of a word or phrase and replace them with something different. I got into the habit of putting a hyphen between “double” and “tap”. Later, I learned that they were written as completely separate words. I therefore had to change all the hyphens in “double-tap” to spaces. To accomplish this, I invoked the Search function in Ulysses. I then entered “double” followed by the hyphen. Next, I moved to the Search button and flicked down on the rotor, reaching Access Search Options. Double tapping this, I flicked right and double tapped on Replace. This brought up a second edit field where I could enter the replacement text. I then felt around the bottom right of the screen to find the Replace button. On its own, it would replace one instance. I wanted to replace all instances. I therefore triple tapped on the Replace button, invoking its context option. Flicking right, I found and double tapped on Replace All, and my task was complete. I just hope I remembered to do this in all the sheets in the guide. This is the sort of thing you’ll want to check for in apps you use extensively. Over the years, I’ve developed a sense of where they might be found. The Replace All option in Ulysses was initially anything but obvious to me.
However, once I discovered it, I began exploring other apps more thoroughly and found numerous options similarly situated. These tend to be options that are sometimes very useful, but not always needed. App developers want them handy, but don’t want them cluttering up the ordinary activity displayed while an app is used. It’s one of those conventions of iOS that can be less friendly to beginners who don’t think to seek these extra options.
This gesture isn’t specific to VoiceOver. To effect a swipe, you quickly move the appropriate number of fingers across the screen. Your fingers should be moving in a direction as they touch the screen so that a swipe is interpreted correctly. Think of it like brushing claws across the screen. A one finger swipe left or right will move to the previous or next item on the screen. The one finger upward and downward swipe moves you to the previous or next rotor item. This is whatever the rotor is set to, such as character, word, etc. A two finger swipe upward will cause VoiceOver to read from the top of the screen or document. A two finger downward swipe is how to cause VoiceOver to “read all”. In other words, VoiceOver will read continuously down a screen or through a document until you stop it with a touch or pause it with a two finger tap. The two finger left and right swipe gestures are used in the group navigation introduced in iOS 15. A two finger swipe to the left moves you out of a group. A right swipe moves you into a group. Three finger swipes are used to scroll in the opposite direction that you swipe. For example, a three finger swipe left will scroll to the right. If you were reading a book, think of a three finger swipe left like turning to the next page and a right swipe like turning back to the previous page. Similarly, three finger swipes up or down will scroll to the lower or higher screen area of a long document or block of information larger than what fits on your device’s screen. The four finger swipe to the left or right can be used to quickly move between various apps you have open. This causes a different app to be the one in focus and running in the foreground. This can be more convenient than using the app switcher, especially if you remember how many swipes to do in order to reach a desired app. Upward or downward four finger swipes are unassigned by default.
Using the “commands” section of VoiceOver settings, you could put these gestures to use in a great many ways.
The Item Chooser
Tapping three times with two fingers will cause you to enter the Item Chooser. This is a useful mode when navigating apps with a lot on their screens. It sets out all the interactive elements that you can then flick left and right through and examine, unhindered by the rest of what an app might be displaying. Double tapping on an item will select it. To exit the Item Chooser without making a selection, simply press the Home button if your device has one, or perform the appropriate gesture, sliding a finger up from the bottom edge of the screen and lifting it after the first beep.
Having control over your screen reader and understanding all related options is of paramount importance. To aid this process, we will discuss all the VoiceOver settings here rather than in the larger section devoted to what’s in the Settings app. You can always refer to this section if you need to understand something about VoiceOver. The first thing you come to in these settings is an item that announces that VoiceOver is on. This is a toggle. If you double tap it, you will shut VoiceOver off. Should you do this accidentally, don’t panic. Simply hold down the Home button until a beep is heard and say “turn on VoiceOver”. This will cause Siri to turn on VoiceOver. Presuming you have set up the Accessibility Shortcut, you can also simply press the Home button three times rapidly, or the side button if your device has no Home button. VoiceOver should turn on and announce this to you. Flicking right from this toggle, you will find some text giving very brief instructions for how to use VoiceOver. To the right of these lines of text, you will come to a button called Practice. This will take you to an area where you can practice different gestures. You can also get to this practice area from anywhere using a four-finger double tap. The subsection immediately preceding this one contains detailed descriptions of important gestures. After reading about a particular gesture, you can go into the Practice mode, try it out, and then exit the Practice mode, returning to this document presuming you are reading it on your iOS device.
Continuing to flick right from the Practice button, you will hear the first setting, which is Speaking Rate. This is actually a heading. Once you have mastered the rotor, turn it to Headings and flick up or down to quickly reach this heading. In this instance, it is the only heading on the screen. While in a web browser, it is a fast way to navigate websites and other documents. In this settings screen, it is handy to quickly reach the first setting past the on/off toggle and introductory text, or to move upward from deep into the settings. Flicking right once more, you will come to a slider. You can adjust this by flicking up or down to increase or decrease the speaking rate of VoiceOver.
This group of settings lets you take control of which voice is used by VoiceOver, as well as other aspects of how it speaks. You can have one or more voices available on your device and add more than one language to your rotor. Each variant of English can have a slot on your Language rotor. You can then quickly switch between these differently accented voices without having to return to this section of Settings. To change the voice occupying a language slot, you will need to use this section. Voices take up space on your device, so you can remove ones you don’t want. Besides choosing which voices are available on your device and in your Language rotor, you can also change how certain words are pronounced. You do this through the Pronunciations button located in the Speech section. Once in the Pronunciations control, you’ll find a list, presuming you’ve added entries before. Alternatively, you will simply find an Add button that brings you to where you can enter words or phrases and how you want them pronounced. These entries can be case-sensitive or not, as desired. They can also be specific to a single app, such as a word processor, or to a particular voice if desired. There’s a lot of room for flexibility here. Having grown up with synthetic speech, I don’t make use of this feature. However, those who have trouble understanding their voice of choice may find it very useful. This is especially true if you want names pronounced correctly. For the most part, I think VoiceOver does a good job, far exceeding what I remember from school days.
These particular settings are discussed in the separate section in this guide detailing Braille support. There are quite a lot of them. Braille support is very extensive and allows pretty much any app useable with VoiceOver to be used in Braille via a Bluetooth connected Braille display.
VoiceOver recognition settings
These settings let you take some control over which recognition capabilities you want enabled or disabled in normal circumstances. You can also customize what these features announce or reveal through sound output. Image Descriptions, Text Recognition, and Screen Recognition can each, at a minimum, be toggled on or off. They also have other settings, such as the ability not to describe inappropriate images out loud automatically.
The next setting is actually a button that opens a small subcategory of settings that control how verbose VoiceOver is. This functions exactly like the button in Accessibility settings that takes you to the VoiceOver settings we are now investigating. Think of it like a big tree with branches. The Verbosity settings are a branch extending from VoiceOver settings. Within this branch, you’ll find a toggle setting for hints. While you’re still learning the ropes, you’ll want these to be enabled or on. There is also a toggle having to do with emojis. VoiceOver can speak the word “emoji” to make certain you know that you have encountered these pictures meant to convey emotional content. If this setting is disabled, you would merely hear the image description of the emoji. You’ll also find a setting indicating whether you want any detected text to be spoken. This can be handy if an item in focus isn’t properly labelled. There might be text on pictures that would also be spoken if this feature is enabled. You can also control the amount of punctuation spoken. Finally, you can determine how you want to handle any tables you might encounter. For now, that’s all you’ll find in this branch of VoiceOver settings. I wouldn’t be surprised if more settings are eventually encapsulated in here, though. To back out of an area or branch of a tree of options, like this Verbosity branch, touch the top-left area of the screen to find the Back button. Double tap this with one finger, and you’ll move back one level. You’ll be doing this often and will soon find your finger remembers precisely where to touch so as to hit the Back button and not the status bar immediately above it.
This small but important subsection of VoiceOver settings lets you control any sound-related aspects of VoiceOver’s operation. Besides speech, VoiceOver contains sound cues to help blind people follow what’s happening. You can turn these on and off here. Also, you can determine whether you want VoiceOver to speak through the left, right, or both channels. You can also do the same for sound effects so that it’s possible to have speech in one ear and sounds in the other if you wish. Another significant setting in this cluster is Audio Ducking. When enabled, any other audio such as music that is playing will have its volume lowered whenever VoiceOver is speaking. This way, you can be certain to hear everything said by VoiceOver. Another potentially useful setting found here lets you automatically switch calls to speakerphone if you aren’t holding your iPhone up to your ear. A new addition as of iOS 13 is the tactile bumps felt as you use VoiceOver. This is called haptic feedback, and you’ll find a Sounds & Haptics setting in the Audio branch of VoiceOver settings. You can choose which elements create bumps when interacted with, turning them on or off as desired. Also, you can select the strength of the bumps that you feel. Of course, it’s also possible to completely turn off this haptic feedback if you dislike it.
One of the most extensive and liberating additions for VoiceOver users in iOS 13 is the ability to fully customize the gestures and keystrokes that VoiceOver makes use of. This level of control was previously only available to users of Braille displays. However, it is now possible to reassign any of the default gestures or keyboard commands Apple has chosen to ones that better suit your abilities and style of use. I won’t go over each and every option here. Doing that could easily add fifty thousand words to this already large document, and there’s really no need. An overview of this branch of settings and a few examples should give you enough to make your own changes. This branch is divided into several sections. Each section has a button that brings you to it. The All Commands button lets you access the entire list. This list is also divided into areas of interest, each having its own button. Next, there are buttons to access all Touch Gestures and all Keyboard Shortcuts. This division is very useful for modifying the interface you’re using. However, I typically prefer going into the All Commands list. Past these buttons, you’ll also find buttons for changing the gestures used during Braille Screen Input or Handwriting. Each of these special modes of input requires its own small set of gestures. These are treated differently while in these input modes. At the end of the Commands branch, you’ll find something vital. It’s the button that lets you reset to the original set of commands. I’m ever so thankful this has been added, as it’s easy to customize everything so much that other users won’t have a clue how to help you. Unfortunately, there’s currently no ability to save a configuration of customized commands. Should you need to use that reset option, all the changes and customizations you’ve made will be lost. While in one of these lists, you can edit the list, changing the order of entries and even removing unwanted commands if desired.
If you want to change an entry, double tap on it. You are then presented with a large list of options for the gesture or keyboard shortcut. The options are organized into sections under headings. This will let you quickly navigate to what you want a shortcut or gesture to do.
Changing the Rotor into Easy Swipes
Many people really struggle with the concept of turning the rotor. One way to make it easier is to change the feel of it entirely. Instead of a dial that turns and points at options, we can make it a menu that is swiped through with two fingers. Swiping left will get us to the previous rotor option, and swiping right will get us to the next rotor option. It becomes a series of pulldown menus we can swipe through. Frankly, I think Apple might have been better off making these two-finger left and right swipes, which we are about to set up, the default way of changing rotor settings. This does conflict with gestures used in Braille Screen Input. It also conflicts with the new default assignments of these gestures to group navigation. You’ll need to find other gestures and make appropriate adjustments should you decide to follow these instructions. Let’s get started. To achieve this, we first need to find a suitable, simple pair of complementary gestures that aren’t already in use. To accomplish that, double tap on the Touch Gestures button. This presents you with a list of gestures starting with the single-finger tap and going all the way up through four-finger taps and swipes. You can flick right quickly through all the taps and reach the Two-Finger Swipe Left gesture. Double tap on this, and you’ll find a large list divided into headings. Turn the rotor to Headings, and flick down until you come to the Rotor heading. There, you’ll find buttons for Previous and Next Rotor. Double tap on the Previous Rotor button. You’ll be returned to the list of gestures. Flick left or right to find the Two-Finger Swipe Right gesture. Double tap on this, and then use the same method you did for finding the Previous Rotor command, but select the Next Rotor option instead. Double tap this, and you’ve done it. Now, you can swipe with two fingers left or right to cycle through the rotor options.
As before, you still flick up or down with one finger to change a rotor option you are currently on or move by that increment through a website, document, or other text. This should feel a lot less strange for people who can’t cope with turning an imaginary knob. Sometimes, changes like this may conflict with other gestures. If you decide to use these two-finger swipes for your rotor, it will conflict with gestures used for Braille Screen Input. If you don’t plan to use this method of input, there’s no worry. However, if you do plan to use Braille Screen Input or Handwriting, you may want to stick with the original rotor turns or find alternative gestures.
Further Thoughts on VoiceOver Commands:
These are two very basic examples. There are endless possibilities for useful customization here. Perhaps, you’re used to a set of keyboard commands from a different screen reader and would like the keyboard commands for VoiceOver to be more like those commands. Perhaps, you find one or more of the gestures assigned by Apple to be less than ideal for your particular dexterity or hand coordination. Turning the rotor is a prime example of this. Now, you can sculpt the set of gestures or keyboard commands to be truly optimal for you. I hope the simple examples I’ve outlined above will be sufficient to drive you to explore this branch of VoiceOver settings and use what it offers. If worse comes to worst, remember that there’s always that Command Reset button. As you create or alter commands, keep symmetry in mind. In the example for previous and next rotor gestures, we used two-finger left and right swipes. Nothing prevents us from using a swipe to go one way and a tap to go another. However, doing this sort of thing can make it more likely that you’ll forget commands. The choices Apple makes in the default set of commands have been carefully thought through to be logical, symmetrical, and consistent. Don’t worry about accidentally overriding the function of a gesture or keyboard shortcut. If you pick a gesture or shortcut that is already being used, a dialogue will pop up to warn you. At that point, you can either cancel the change or reassign the gesture or keyboard command to your newly chosen purpose. Absolutely nothing is written in stone.
In addition to the ability to customize gestures and keyboard commands discussed above, Apple has also added the ability to customize how VoiceOver behaves based on the app you’re in or the kind of thing you’re trying to do. These two capabilities transform VoiceOver from a “my way or the highway” screen reader into an experience that is as flexible as other screen readers have been since the beginning. Perhaps, you want a slower voice that doesn’t speak out punctuation marks for use while reading a Kindle book. On the other hand, you want a faster voice that reads most or all punctuation marks when you’re writing or proofreading a document. Until iOS 13, you would have had to adjust everything manually each time you wanted to make a change. Activities settings can now perform these changes automatically or at your command through the ability to select Activities using the rotor. The area where you manage and create Activities is just past the Commands setting. Double tap on Activities, and you’ll be there. The first thing you’ll find is an Edit button. This lets you rearrange and remove Activities you’ve created. After that, you’ll find the programming example Activity Apple has created. This will give you an idea of what ingredients go into an Activity. Past that, you’ll find a very brief text explaining what Activities are. Last of all, there’s the Add Activity button. As you create Activities, they’ll be added to the list after the Edit button. You will also be able to select them using the Activities setting on the rotor. When you double tap on the Add Activity button, you are presented with an Activity Creation dialogue. The first thing on it is a text field for you to choose a name for that Activity. It would be a good idea to name activities based on the specific app or situation you want them to be triggered with. I write a lot, so we’ll create a group of settings ideal to my writing habits. I’ll name the Activity “Writing”. 
The first thing I come to is a heading called Speech Settings. I could select a different voice, but will leave it on the default setting. The speaking rate defaults to 50%, which is too slow for when I’m writing. I’ll set that to 65%. This takes some fiddling, since the slider moves in 10% increments. Dragging and sliding your finger can help with this, as can double tapping on the slider, which changes its value by 2% at a time. The volume is fine where it is. Verbosity settings come next. I like my Punctuation setting set to “some” while writing. I don’t use emojis, so that’s set to “off”. The same goes for table information. Using a physical keyboard, I need not worry about Braille Settings. Flicking down to the next heading, I find Automatic Switching, which will cause the Activity to activate when the correct app or situation is encountered. This is a context-specific Activity I want to create, so I’ll double tap on Context. The very first Context option is Word Processing. I double tap on this, which selects it. I then need to use the Back button to leave the list of contexts and return to the Activity Creation dialogue. The final part of the dialogue lets you select different modifier keys for indicating VoiceOver control rather than the default ones. I never use the Caps Lock while writing, so I choose not to change that. Having gone through the full dialogue, I touch the top left of the touchscreen to find the Back button. There is no Save button. I am returned to the main Activities dialogue and find that both the Programming and Writing Activities are now present. There are around seven or eight different contexts, such as Narrative or Spreadsheet, that can be chosen from. Presumably, any app you have installed can be added to the Automatic Switching list to trigger the Activity’s settings to come into effect. It is a simple yet powerful tool for customizing how VoiceOver works in different situations.
I wouldn’t be surprised if the number of options and behaviour possibilities increases in the Activities setting over time.
This lets you select between group or flat navigation. The flat method is what VoiceOver users are already familiar with. Elements are moved to from left to right, top to bottom. Group navigation offers a way to move between, into, or out of groups of related items. The Weather app offers a good example of how useful this might be. Using group navigation, I can easily flick right over a list of groups and quickly reach the “rainfall” group. I then swipe right with two fingers to enter the “rainfall” group. Flicking right with one finger will then give me the information about any expected rainfall. I can then swipe left with two fingers to return to the list of groups. Using the classic flat navigation would involve a lot of scrolling and guesswork to reach the information I wanted. Grouped navigation can come in very handy at times. You can add the navigation style option to the quick settings or the rotor so you can turn it on or off as desired.
These settings allow you to customize the order and contents of the VoiceOver rotor. Simply double tap on features to add or remove them. To rearrange them, tap and hold your finger down on the Reorder button next to the feature you want to move. Keeping your finger on the screen, slide it up or down. You’ll hear a brief beep, and VoiceOver will tell you that you’ve moved the feature above or below others as you pass over them. That’s all there is to this section, but it’s enough to really make a big difference with your experience of the rotor.
The settings in this section let you adjust how VoiceOver interacts with the typing experience. There are a few different aspects to this that we’ll examine below.
Typing Style
There are three typing styles or modes to choose from. The Default is called Standard Typing. In this mode, you can feel around and find the key you want to type in. Once you’ve found it, tap twice quickly on that spot. Alternatively, you can hold your finger on that key while tapping anywhere else on the screen with another finger. This is called a split tap. I personally find this method to be slow and cumbersome. However, it makes as certain as possible that what gets entered is what you intend. This is especially safe and useful for beginners or people with hand mobility challenges. Don’t worry about how long you hold down your finger on the key after you find it. You have all the time in the world. The next mode is called Touch Typing. In this mode, you can feel around to find the right key, and then simply lift your finger off the screen to enter the keystroke. I prefer this method as it speeds up the typing process. Don’t take too long raising your finger, though. Otherwise, you will find yourself entering an alternate symbol. These are less commonly used, but handy every once in a while. I find the time before alternates appear to be long enough, but beginners might feel a little rushed. Experiment with this typing mode in the Notes app, where it’s impossible to do anything too unexpected. The last typing mode is called Direct Touch Typing. It basically lets you type unaffected by VoiceOver. Keystrokes are entered the instant a key is touched, so you had better know exactly where you put your finger. This mode is for those who are blessed with good hand coordination and muscle memory.
This lets you hear phonetic words used to represent entered characters, like you hear over radio transmissions. For instance, “Alpha” for “A”, “Bravo” for “B”, “Charlie” for “C”, etc. You can choose to hear only the phonetic words or have the normal characters spoken followed immediately by the phonetic word. This might be useful in noisy environments when it would be easy to miss hearing just the typed character spoken. Choices here are Off, Character and Phonetics, or Phonetics Only.
This setting lets you pick what kind of feedback you want while typing. You can choose between Nothing, Characters, Words, or both Characters and Words. This last mode means you’ll hear each typed character spoken as well as the whole word when a space, return, or punctuation mark is inserted. These choices are the same for all software keyboards found on the screen of your device, with a separate set of these choices for physical hardware keyboards. Currently, I’m typing this on a mechanical keyboard I recently acquired for use while at home. It provides excellent tactile feedback. As I’ve come to trust that keystrokes are being properly registered, I’ve switched to no typing feedback while I’m using it. Meanwhile, for typing on my iPhone’s screen, I prefer to hear characters I type spoken so that I know they’re correct before proceeding to the next character.
This setting lets you choose which keys are used to indicate that you want to control VoiceOver rather than type something while using a physical keyboard. The Caps Lock key is my preferred modifier key. You can also use the Control and Option keys held down together. Nothing stops you from having both options selected.
Keyboard interaction time
This determines how long you can continue to hold down a key before VoiceOver decides that you wish to access the Slide to Type feature or alternative options. The default delay is one second. Beginners who don’t often make use of alternative options may wish to increase this to two or more seconds so that they aren’t rushed while learning how to type proficiently on the touchscreen.
Always speak notifications
This setting tells VoiceOver whether you always want to hear new Notifications when they appear. This is the default behaviour. If you would rather not be informed, you can set this to off, and check the Notification Centre for any updates when it’s convenient to you. This might be preferred when reading a book or working, for instance. You can have Notifications generate sounds that are unique to a given app. This can be a good compromise as you’ll know from the sound that an update is available from a specific app without your workflow or reading being interrupted.
This setting lets VoiceOver move between images on a website. You can set it to only move to images with descriptive alt text and ignore ones without this information. Alternatively, you can have VoiceOver register all images regardless of whether they’re described. Finally, you could elect to have VoiceOver ignore all imagery in websites, skipping over them silently. Depending on what you’re doing, all of these modes can be useful options to have.
This cursor visually outlines whatever element is currently being spoken by VoiceOver. This makes no difference to totally blind people, but may be helpful to people with low vision or to sighted people trying to understand how VoiceOver works.
At times, it can be useful for sighted people to see what VoiceOver is speaking and when things are spoken. This can be useful when figuring out how to resolve issues with VoiceOver and to give sighted people a better sense of how VoiceOver works with apps and in various situations. This panel can either be on or off.
Double tap time out
This last setting allows you to specify the amount of time during which a second tap will count as a double tap. This can be extremely helpful for people who can’t move their fingers quickly and accurately enough to consistently achieve a double tap when they wish. You can set the number of milliseconds that works for you. There are buttons to increase and decrease the time. If you find that you’re registering double taps when you don’t intend to, try shortening this timeframe and see if that helps.
VoiceOver Key Commands
Rather than having all of the many key commands in this guide, I’ll direct those who are interested to Apple’s accessibility page at www.apple.com/accessibility. Go to the Vision page and then to Explore Options for Blind and Low Vision. There, you will find a section called Support, where you can get the full user guide for your product in tagged HTML, Apple Books, or even downloadable Braille. This will also make certain that you’re getting the most up-to-date list possible. Make use of these commands via the modifier key or keys you choose in VoiceOver settings. I prefer using the Caps Lock key as my modifier, but you may also use the Control plus Option key combination. Those keys are at the very bottom of the keyboard, and that works better for some people. In the following list, I’ll refer to these modifiers as VO. You hold down that modifier key or keys, and then also press whatever additional key or keys will invoke the command. For example, to enter VoiceOver Help, press VO plus the letter “k”. That is, hold down the Caps Lock key or the Control and Option keys, and while holding that down, hit the letter “k”. To get people started off, here are some important basic key commands:
Enter VoiceOver Help — Press VO plus letter k.
Exit Help and return to previous screens — Press Escape key.
Go to home screens — Press VO plus letter h.
Move to status bar — Press VO plus letter m.
Turn rotor left or right — Press VO plus Command plus Left or Right Arrow.
Flick up or down on rotor — Press VO plus Command plus Up or Down Arrow.
Open Notification Centre — Press VO plus FN plus Up Arrow.
Open Control Centre — Press VO plus FN plus Down Arrow.
Open Item Chooser — Press VO plus letter i.
One-finger double tap — Press VO plus Spacebar.
Two-finger double tap — Press VO plus Hyphen or Dash.
Swipe up or down — Press VO plus Up or Down Arrow.
Quick Nav Mode
For physical keyboard users, Apple has devised a mode called Quick Nav that lets you do a lot of navigating web pages or other things using letters and arrow keys. To activate or deactivate this mode, press VO plus both the Left and Right Arrow keys. While in this mode, you can use the left and right arrows to quickly reach the next or previous element in an app. Use the up and down arrows to move to the next or previous item specified by the rotor. Use letters like “h” and “t” to reach the next heading or table within a website. To move backwards by these amounts, simply use the same letter while holding down the Shift key. It’s a really fast and simple way to move around. Just remember to turn off Quick Nav mode when you need to type in fields or edit documents. Be certain to look through Apple’s user guide, as there are numerous commands in this mode.
Keyboard Versus Touchscreen Gestures:
There are commands to do absolutely everything with a physical keyboard. I’ve just given you enough here to get started. Make use of the VoiceOver keyboard help and look through Apple’s user guide to learn about all the commands and get optimum use of your device and keyboard combined. Nothing says you need to choose between one or the other. I tend to use my keyboard mainly for typing, and simply reach over with one hand to my iPhone while working at my desk to use gestures. For me, that’s far more intuitive and efficient than memorizing all the key commands. If you do a lot of work while travelling, you might find that it’s worth learning all of these commands so that you can operate your device while only needing your keyboard to be on your lap while in a vehicle. Your iOS device can be safely packed out of sight while turned on. This can be particularly convenient with wireless Bluetooth headsets or AirPods. Compared to some other screen readers, there are fewer commands to learn about. Also, Apple has kept things consistent regarding groups of commands. Once you’ve learned enough of them, intuiting others gets easier.
VoiceOver has come a very long way in ten years. Back when I got my iPhone 4, VoiceOver had only been around for perhaps a year or so. Even then, I remember how delightful everything was once I had mastered the basics. Apps and websites have gotten more complex as time has passed. VoiceOver has also improved drastically over the same span of time. However, these improvements don’t ever seem to make the experience of using VoiceOver feel complicated. It still feels just as natural to me now as it did back then. The overall concepts and methods of interface have held up tremendously well. At this point, VoiceOver has really matured to where an iPhone or iPad can be nearly as easy to master as, and more comprehensive in the capabilities it offers than, any product designed specifically for blind people. More than most screen readers, VoiceOver bridges the gap and shapes our experience of what it gives us access to. If you can understand the concepts I’ve gone over in this section, then you’ve managed to make the largest leap into radical territory necessary to get maximum benefit from your iOS device. Once you’ve mastered VoiceOver enough to competently use a touchscreen, the rest is just a matter of understanding apps, how they’re laid out, and their capabilities. It’s a good idea to take time getting a feel for how VoiceOver works and figuring out which settings will work best for you. This is the groundwork upon which all else builds. iOS 13 brought several new capabilities that have propelled VoiceOver forward by leaps and bounds. In addition to the Commands and Activities settings, there is also the increased ability to describe pictures and images. This can help make formerly inaccessible apps useable, even if not as elegant as the experience can be when developers actually put effort into polishing accessibility. iOS 14 and 15 have added further capabilities.
Most notably, we’ve seen artificial intelligence unleashed to improve recognition of text, images, and elements of apps including controls. Big leaps forward like this take time to mature, and I trust we’ll see that happen in the next versions of iOS. We’ve come to a point where Apple needs to shore up VoiceOver and make certain that focus and basic functions work consistently. Charging ahead with these new ideas, they’ve left older problems unresolved. In the case of iOS 15, some have persisted until the very end of the cycle. It would be great to start out from a more polished place so people could appreciate what’s new without grappling with what should long since have been fixed. While Apple provides the accessibility tools, it’s up to individual app developers how much they use them, presuming they decide to at all. This can make a huge difference. For instance, the game Six Ages makes extensive use of the VoiceOver Hints system to deliver important game information to the player in a convenient way. This makes for a far more enjoyable experience. Adding special hints, image descriptions, and alternative interface elements like custom rotor options can turn an app from accessible but impractical to being outright delightful to use for blind people. Another obvious example is deciding what gets spoken automatically. Too much can be just as annoying as not enough. The user needs to take a bit more of a proactive role when it comes to exploring the screen and actively looking for things like buttons or elements. It’s a tradeoff between efficiency and awareness, which lets many apps be far more accessible and practical to use than they otherwise would be. Keep in mind that when all of this was initially conceived, there was no app store filled with hundreds of thousands of apps. VoiceOver has grown in capability in similar fashion to how iOS has expanded over the years. 
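For any app developers reading this, the mechanics behind that kind of polish are refreshingly simple. The short Swift sketch below shows the standard UIKit properties involved: a label that VoiceOver speaks on focus, and a hint spoken afterward, exactly the hints governed by the Verbosity toggle described earlier. The button and its wording are invented for illustration, not taken from Six Ages or any real app.

```swift
import UIKit

// Hypothetical "End Turn" button for a game like the one described above.
let endTurnButton = UIButton(type: .system)
endTurnButton.setTitle("End Turn", for: .normal)

// The label is what VoiceOver speaks the moment the element gains focus.
// Keep it short; it is the element's name, not its documentation.
endTurnButton.accessibilityLabel = "End Turn"

// The hint is spoken after a pause, and only when the user has left
// hints enabled in VoiceOver's Verbosity settings.
endTurnButton.accessibilityHint = "Finishes your move and lets the other players act."
```

This division of labour is why turning hints off, as experienced users often do, costs nothing essential: the label still identifies the element, while the hint carries the optional extra guidance.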
Thanks to Apple’s control of VoiceOver’s development, you can be certain that any app that comes pre-installed on your iOS device will have been made by Apple and will work with VoiceOver. Apple does a lot to encourage third-party developers to make their apps accessible; however, nothing really forces them to. Don’t be surprised if you end up getting an app that just won’t work with VoiceOver. Some apps are simply too visual to be made accessible. In other cases, developers genuinely aren’t aware that there are blind users. I can’t count the number of people I’ve met who have owned iPhones for years and are still blown away when they learn of all the accessibility tools they never knew existed. Once they decide to make their apps accessible, Apple provides ample resources that help them put VoiceOver and many other tools to maximum good use. There’s still a lot of room for growth with VoiceOver. I think the biggest bang for buck will come from increasing awareness about it among app developers and the wider public. The overall user experience hinges just as much on efforts of people working outside the walls of Apple as it does on Apple’s own considerable efforts to offer the best possible accessibility tools. I consider it very unfortunate that Apple has yet to create any kind of VoiceOver tutorial that comes included with iOS. Far too many blind owners of iOS devices who I’ve spoken with over the years simply didn’t know about often very basic capabilities of VoiceOver. A tutorial or user guide that people were made aware of right from their first experience would go a long way to help the situation. I’ve felt compelled to write this guide to address this. As more blind owners become aware of what is possible, I hope we see an increased willingness to explore and try new apps. 
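Among those resources is the UIAccessibility API itself, which lets an app ask whether VoiceOver is running and speak announcements of its own without hijacking focus. Here is a minimal sketch; the download scenario is an invented example.

```swift
import UIKit

// Ask whether VoiceOver is currently running, so the app can adapt
// its behaviour (for example, replacing a purely visual progress cue).
if UIAccessibility.isVoiceOverRunning {
    // Speak a one-off announcement without moving the VoiceOver cursor.
    UIAccessibility.post(notification: .announcement,
                         argument: "Download complete.")
}
```

Used sparingly, announcements like this strike the balance discussed above between speaking too much and too little: the user hears what matters without losing their place on the screen.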
I also hope blind people who find themselves frustrated that an app is inaccessible will take the time to contact the developer and express their interest in the app were it to become more accessible. If enough people take this step, many developers will take the time to improve the accessibility of their apps, leading to better experiences for everyone. Through VoiceOver, Apple has welcomed blind people into their space. As paying customers, I believe people have every right to complain about accessibility issues with apps and with VoiceOver itself. Through polite and constructive criticism, I believe we have a tremendous opportunity to make a great deal of positive change.