
Apple Adds Voice-Based Accessibility Features


Apple this week previewed several accessibility features, including Vocal Shortcuts and additional voice control and captioning features, as part of its latest software releases slated to drop later this year.

"We believe deeply in the transformative power of innovation to enrich lives," said Tim Cook, Apple's CEO, in a statement. "That's why for nearly 40 years Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We're continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users."

"Each year, we break new ground when it comes to accessibility," said Sarah Herrlinger, Apple's senior director of global accessibility policy and initiatives, in a statement. "These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world."

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks.
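
Apple has not published a developer-facing Vocal Shortcuts API, but the closest existing mechanism for tying a custom spoken phrase to an app action is the App Intents framework. The sketch below is an illustration of that general pattern only, not the new accessibility feature itself; the LogMedicationIntent and ExampleShortcuts names are hypothetical.

```swift
import AppIntents

// Hypothetical intent for illustration; not an Apple-documented Vocal Shortcuts API.
struct LogMedicationIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Medication"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would go here.
        return .result(dialog: "Medication logged.")
    }
}

// Registers spoken phrases that Siri can route directly to the intent above.
struct ExampleShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMedicationIntent(),
            phrases: ["Log my medication in \(.applicationName)"],
            shortTitle: "Log Medication",
            systemImageName: "pills"
        )
    }
}
```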

Listen for Atypical Speech, another new feature, enhances speech recognition for a wider range of speech. It uses on-device machine learning to recognize the user's speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
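
Apple has not exposed Listen for Atypical Speech as a developer API, but the on-device processing it relies on can be illustrated with the existing Speech framework, which already offers an on-device recognition option. The sketch below is that existing mechanism only; the transcribeOnDevice function and audioFileURL parameter are placeholder names.

```swift
import Speech

// Minimal sketch of on-device speech recognition with Apple's Speech framework.
// This is not the Listen for Atypical Speech feature; it only shows the
// requiresOnDeviceRecognition option that keeps audio processing on the device.
func transcribeOnDevice(audioFileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
        request.requiresOnDeviceRecognition = true  // keep audio on the device

        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```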

"Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers," said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project, in a statement. "The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible."

Voice Control is also coming to CarPlay, allowing users to navigate CarPlay and control apps with just their voice.

Accessibility features coming to visionOS will include systemwide Live Captions to help users who are deaf or hearing impaired follow along with spoken dialogue in live conversations and audio from apps. With Live Captions for FaceTime in visionOS, more users can connect and collaborate using their Personas.

Additional voice technology improvements include the following:

  • For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
  • For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
  • For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
  • Voice Control will offer support for custom vocabularies and complex words.
