Apple unveils new tools for cognitive, speech, vision accessibility in its products

The tech giant introduced new tools for cognitive accessibility, along with Live Speech, Personal Voice, and Point and Speak in Magnifier in celebration of Global Accessibility Awareness Day on May 18.

New Delhi: Apple on Tuesday previewed new software features for cognitive, speech and vision accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak, that will be available in its devices later this year.

“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Tim Cook, Apple’s CEO.

“We’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love,” Cook added.

With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken aloud during phone and FaceTime calls as well as in-person conversations.

Users can also save commonly used phrases to chime in quickly during lively conversations with family, friends, and colleagues.

For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.

Users can create a Personal Voice by reading along with a randomised set of text prompts to record 15 minutes of audio on iPhone or iPad, said Apple.

“These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.

The Assistive Access feature uses innovations in design to distill apps and experiences to their essential features, lightening cognitive load.

The feature includes a customised experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos and Music.

It offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support, according to the company.

Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.

The Point and Speak feature in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels.

“Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment,” said Apple.

The Voice Control feature adds phonetic suggestions for text editing, so users who type with their voice can choose the right word from several that sound alike.

Additionally, with the Voice Control Guide, users can learn tips and tricks for using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.

Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favourite games on iPhone and iPad, said Apple.

The SignTime feature will launch in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters.

The service is already available for customers in the US, Canada, the UK, France, Australia and Japan.