Ahead of its annual Worldwide Developers Conference (WWDC) in June, Apple has unveiled a suite of new accessibility features coming later this year, providing equal access for all users across its apps and devices.
“At Apple, accessibility is part of our DNA,” said Apple CEO Tim Cook in a statement. “Making technology for everyone is a priority for all of us, and we’re proud of the innovations we’re sharing this year. That includes tools to help people access crucial information, explore the world around them, and do what they love.”

Here’s a rundown of all the accessibility features being rolled out this year:
Accessibility Nutrition Labels on the App Store
Accessibility Nutrition Labels add a new section to App Store product pages highlighting the accessibility features an app or game supports, such as VoiceOver, Voice Control, Larger Text, Sufficient Contrast and Reduced Motion. With this feature, users can tell at a glance whether a particular app will be accessible to their needs. Developers will also receive guidance on the criteria for displaying this accessibility information ahead of the feature’s worldwide launch on the App Store later this year.
Accessibility Reader
Arriving on iPhone, iPad, Mac and Apple Vision Pro, Accessibility Reader is a systemwide reading mode designed to make text easier to read for users with disabilities. It lets users customise text to suit their individual needs, with a variety of fonts, colours and spacing options available. Accessibility Reader can be launched from any app and will also support Spoken Content.
Magnifier on Mac devices

Magnifier, a feature introduced on the iPhone and iPad in 2016, is heading to the Mac. Designed to help users who are blind or have low vision read text and detect people and objects around them, Magnifier on Mac will connect to the camera on a user’s machine to zoom in on objects in their surroundings. It will also integrate with Accessibility Reader to adjust real-world text to a user’s needs.
Braille Access
Users will be able to turn their devices into full-featured braille note takers with Braille Access, an app with a built-in launcher that allows any app to be opened by typing with Braille Screen Input or a connected braille device. Users can then take notes in braille format and perform calculations using Nemeth Braille, or open Braille Ready Format (BRF) files directly from the app for easy access to books or previously taken notes. Braille Access also integrates a version of Live Captions that allows users to transcribe conversations in real time on braille displays.
Apple Watch Live Captions
The Apple Watch will receive several features designed to assist users who are deaf or hard of hearing, including Live Listen controls and real-time Live Captions. Live Listen turns an iPhone into a remote microphone that streams audio to a user’s AirPods, Beats headphones or Made for iPhone hearing aids. Users can then view Live Captions on their Apple Watch while listening to the audio, and can start or stop Live Listen sessions from the watch itself, removing the need to constantly retrieve their phones. Live Listen will also be compatible with the Hearing Aid feature on the AirPods Pro 2.
Apple Vision Pro visionOS enhancements

The Apple Vision Pro’s visionOS is set to receive expanded accessibility features for users who are blind or have low vision. These include updates to Zoom, which lets users magnify objects in view using the device’s main camera, and Live Recognition in VoiceOver, which uses on-device machine learning to describe a user’s surroundings, find objects and read documents. A new API for accessibility developers will also enable approved apps to access the main camera for visual interpretation.
Alongside these new features, Apple will also roll out a variety of updates to its existing suite of accessibility settings, with a range of improvements including:
- Personalisable EQ settings for Background Sounds
- Improvements to Personal Voice using on-device machine learning and artificial intelligence
- The addition of Vehicle Motion Cues on Mac devices
- Improvements to Eye and Head Tracking
- Added support for Switch Control for iOS, iPadOS, and visionOS
- The addition of a simplified Apple TV media player using Assistive Access
- Expanded customisation of Music Haptics on iPhone
- Name Recognition feature added to Sound Recognition
- Added vocabulary syncing across devices and expanded language support for Voice Control
- Additional language support for Live Captions
- Updates to CarPlay, including support for Large Text and improved Sound Recognition
- Share Accessibility Settings feature to enable users to temporarily share their settings with another iPhone or iPad
With these features and improvements, users with disabilities will be able to better access apps and devices across the Apple ecosystem. The features will roll out over the remainder of the year, with more information likely to come during Apple’s upcoming WWDC, happening from 9 to 13 June 2025 at Apple Park in California.