From dark mode in iOS 13 to a redesigned user interface in tvOS to the dismantling of iTunes to the coming of iPadOS, Apple made a slew of announcements at its Worldwide Developers Conference keynote on Monday in San Jose. And accessibility was there in full force.

Accessibility, as it always does, plays a significant role not only in the conference itself (the sessions, labs and get-togethers are all mainstays of the week) but also in the software Apple shows off. Of particular interest this year is Apple’s Voice Control feature, available on macOS Catalina and iOS 13 devices, which allows users to control their Macs and iPhones using only their voice. In fact, it’s so compelling that Apple deemed it a banner feature worthy of precious slide space during Craig Federighi’s onstage presentation.

After the keynote concluded, I had an opportunity to sit down with Sarah Herrlinger, director of Global Accessibility Policy & Initiatives at Apple, to talk more in-depth about Voice Control, as well as some other notable accessibility features coming to Apple’s platforms in 2019.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Herrlinger said. “There’s something in each operating system and things for a lot of different types of use cases.”

Hello, computer

Although much of the conversation about what Apple announced revolves around iPadOS and Project Catalyst, judging by what I’m hearing on podcasts and seeing in my Twitter timeline, Voice Control is definitely a crown jewel too. Nearly everyone has praised not only the engineering that went into developing it, but also the fact that Apple continues to lead the industry in making accessibility a first-class citizen. Myke Hurley put it best on Upgrade, the weekly podcast he co-hosts with Jason Snell, following the event: Voice Control is something Apple doesn’t have to do. They do it, he said, because it’s the right thing to do for every user. Put another way, Apple works so tirelessly on accessibility not for “the bloody ROI,” to paraphrase Tim Cook.

Sarah Herrlinger, director of Global Accessibility Policy & Initiatives

Herrlinger demoed Voice Control for me, and it works as advertised despite the considerable ambient noise of our setting. The gist is simple enough: You give your MacBook or iMac commands, such as “Open Mail” or “Tell Mary ‘Happy Birthday’ ” in Messages. Beyond the basic syntax, however, there are elements of Voice Control that make dictating to your Mac (or iOS device) easier. For example, Herrlinger explained how you can say “show numbers” in Safari’s Favorites view, and little numbers appear beside each favorite’s favicon. Say TechCrunch is No. 2 in your list of favorites; if the icon is hard to make out visually, saying “open 2” prompts Voice Control to launch TechCrunch’s page. Likewise, you can say “show grid” to overlay a grid on the screen so you can perform actions such as clicking, tapping or pinching and zooming.
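Voice Control builds on the accessibility metadata apps already expose, which is why well-labeled interfaces respond to it best. As a brief, hedged sketch (my own example, not one from Apple’s demo): starting with iOS 13, an app can supply alternate spoken names for a control through the accessibilityUserInputLabels property, so any of several phrases will activate it by voice.

```swift
import UIKit

final class ComposeViewController: UIViewController {
    private let sendButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        sendButton.setTitle("Send", for: .normal)
        sendButton.frame = CGRect(x: 20, y: 80, width: 120, height: 44)

        // The name assistive technologies announce for this control.
        sendButton.accessibilityLabel = "Send message"

        // New in iOS 13: alternate spoken names Voice Control accepts,
        // so "Tap Send," "Tap Send message" and "Tap Submit" all work.
        sendButton.accessibilityUserInputLabels = ["Send", "Send message", "Submit"]

        view.addSubview(sendButton)
    }
}
```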

Herrlinger told me Voice Control, while conceptually straightforward, is designed to be deep and customizable. Furthermore, she added that Apple has put in a ton of work to improve the speech detection system so that it can more adeptly parse users with different types of speech, such as those who stutter. Over time, Voice Control should only get better at this.

Of course, the reason for all the excitement over Voice Control is the way it makes computing more accessible. Which is to say, Apple has reached an inflection point with its assistive technologies: someone who can’t physically interact with their computer now has an outlet. Controlling a machine using only your voice used to be the stuff of science fiction, but now it’s more or less reality. There are other tools in the ballpark, like Apple’s own Switch Control, but Voice Control takes things to a whole other level. Apple is putting a stake in the ground: if you can’t touch your computer, just talk to it. For many disabled people, the floodgates just opened. It’s a big deal.

Hover Text is Dynamic Type reimagined

I’ve made my affection for iOS’s Dynamic Type feature known countless times. By the same token, I’ve made my displeasure at its absence on macOS known just as often. Apple heard me.

Another feature Herrlinger was keen to show me was something Apple is calling Hover Text, coming to macOS. A subset of the existing Zoom functionality, Hover Text reminds me somewhat of tooltips in Windows. The “Hover” name describes the function: place your mouse pointer over a selection of text and you get a bubble showing that text enlarged.

Herrlinger told me the feature works system-wide, even in places like the menu bar. And yes, Hover Text is indeed customizable; users have access to a wide variety of fonts and colors to make Hover Text’s “bubbles” their own, and text can be enlarged up to 128pt, Herrlinger said. What this means is users can play with different permutations of the feature to find which combination works best: say, a yellow background with dark blue text set in Helvetica for the highest contrast. The possibilities are virtually endless, a testament to how rich the feature is despite its simplicity.
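To make the behavior concrete, here is a toy sketch of the Hover Text idea in AppKit. To be clear, this is my illustration of the concept, not Apple’s implementation (the real feature lives in macOS Catalina’s Zoom accessibility settings and works system-wide); the HoverTextDemoView class and its styling choices are mine.

```swift
import AppKit

// A toy approximation of the Hover Text idea, for illustration only.
// The real feature is built into macOS and works system-wide; this
// sketch merely mimics the visible behavior inside one text view.
final class HoverTextDemoView: NSTextView {

    // The floating "bubble": dark blue Helvetica on a yellow background,
    // echoing the customization options described above. Hover Text
    // itself allows sizes up to 128pt.
    private let bubble: NSTextField = {
        let field = NSTextField(labelWithString: "")
        field.font = NSFont(name: "Helvetica", size: 64)
        field.textColor = .systemBlue
        field.backgroundColor = .systemYellow
        field.drawsBackground = true
        field.isHidden = true
        return field
    }()

    override func updateTrackingAreas() {
        super.updateTrackingAreas()
        trackingAreas.forEach(removeTrackingArea)
        addTrackingArea(NSTrackingArea(
            rect: bounds,
            options: [.mouseMoved, .mouseEnteredAndExited, .activeInKeyWindow],
            owner: self,
            userInfo: nil))
        if bubble.superview == nil { addSubview(bubble) }
    }

    override func mouseMoved(with event: NSEvent) {
        let point = convert(event.locationInWindow, from: nil)
        let index = characterIndexForInsertion(at: point)
        let text = string as NSString
        guard index < text.length else {
            bubble.isHidden = true
            return
        }
        // Expand the insertion point to the whole word under the cursor,
        // then show that word enlarged next to the pointer.
        let wordRange = selectionRange(
            forProposedRange: NSRange(location: index, length: 0),
            granularity: .selectByWord)
        bubble.stringValue = text.substring(with: wordRange)
        bubble.sizeToFit()
        bubble.setFrameOrigin(NSPoint(x: point.x, y: point.y + 8))
        bubble.isHidden = wordRange.length == 0
    }

    override func mouseExited(with event: NSEvent) {
        bubble.isHidden = true
    }
}
```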

At a high level, Hover Text strikes me as very much the spirit animal of my beloved Dynamic Type. They’re clearly different features, with clearly defined purposes, but both strive to achieve the same goal in their own ways. Herrlinger told me Apple strives to create software solutions that make sense for the respective platform and the company’s accessibility group believes Hover Text is a shining example. They could’ve, she told me, ported Dynamic Type to the Mac, but found Hover Text accomplished the same goal (enlarging text) in a manner that felt uniquely suited to the operating system.
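For contrast, here is roughly what Dynamic Type adoption looks like on the iOS side: a minimal sketch using the standard UIKit calls. It shows why Dynamic Type is a per-label, per-app affair, unlike Hover Text’s system-level overlay.

```swift
import UIKit

// Minimal Dynamic Type adoption on iOS: ask for a semantic text style
// and opt in to automatic resizing, and the label tracks whatever text
// size the user picks in Settings, with no per-size layout code.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0 // let enlarged text wrap instead of truncating
```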

iOS gains pointer support, sort of

As first spotted by the ever-intrepid master spelunker Steve Troughton-Smith, iOS 13 includes pointer support, albeit as an accessibility feature.

Mouse support lives in the AssistiveTouch menu, the suite of options designed for users with physical motor delays who can’t easily interact with the touchscreen itself. Apple says it works with both USB and Bluetooth mice, although the company doesn’t yet have an official compatibility list. It’s telling that mouse functionality is purposely housed among the accessibility features: Apple obviously sees its primary value as a discrete assistive tool. Of course, accessibility features have far greater relevance than simply being bespoke tools for disabled people. Just look at Troughton-Smith’s tweet for proof.

Still, in my conversation with Herrlinger, she emphasized that Apple built pointer support into AssistiveTouch as a feature designed and developed with accessibility in mind. In other words, support for mice and external pointing devices is intended expressly for accessibility’s sake. As usual with Apple products, Herrlinger told me, the foundational parts of pointer support date back “a couple years.” This is something they’ve been working on for some time.

To Apple, Herrlinger said, pointer support, which is available on both iOS 13 and iPadOS, is a feature the team felt needed to exist because they recognized the need for it. There’s a whole class of users, she told me, who literally cannot access their devices without some other device, like a mouse or joystick. Hence, the team embarked on a mission to accommodate those users. When I asked why Apple would build pointer support into a touch-based operating system, Herrlinger was unequivocal in her answer: it serves a need in the accessibility community. “This is not your old desktop cursor as the primary input method,” she said.

Nor is it meant to be your secondary choice. The bottom line is that, while Apple loves the idea of accessibility features being adopted by the mainstream, pointer support in iOS 13 and iPadOS really isn’t the conventional PC input mechanism at all. It’s a niche feature meant to suit a niche use case; it isn’t supposed to represent the milestone in iPad’s productivity growth that many think it could be. Maybe that changes over time, but for now, it’s the new Mac Pro of software: not for everyone, not even for most people.

That said, a crucial point should be made here: people without disabilities will use this feature, regardless of its intended utility, and Apple recognizes that. No one will stop you from plugging a mouse into your iPad Pro. It’s no different from someone using Magnifier to read the fine print on a restaurant menu, or using Type to Siri to quietly give commands in a Messages-like environment.

“Accessibility features can benefit more than the original community they were designed to support,” Herrlinger said. “For example, many people find value in closed captions. Our goal is to engineer for specific use cases so that we continue to bring the power of our devices to more people.”

It’s important, though, to take this feature in context. Users should be cognizant that this implementation of pointer support is not meant to drastically alter iPad’s primary input model. That is the broader point Apple is trying to make here, and it’s a good one.