Apple launches Live Captions, Door Detection, and other accessibility features for iOS users

The Door Detection feature will be accessible through a new Detection Mode in Apple’s Magnifier app. People Detection and Image Descriptions will be two new capabilities that can function alone or in tandem with Door Detection to aid people who are visually impaired or have low vision.


Apple launched a set of accessibility features on Tuesday aimed at assisting people with disabilities. The new capabilities, which will arrive later this year on the iPhone, Apple Watch, and Mac, are said to leverage advances in hardware, software, and machine learning to assist those with low vision, vision impairment, and physical or motor limitations. Door Detection for iPhone and iPad users, Apple Watch Mirroring, and live captioning are among the highlights. VoiceOver has also been updated with support for more than 20 new locales and languages, according to Apple.

Door Detection, which employs the LiDAR sensor on the latest iPhone or iPad models to help users navigate to a door, is one of the most useful accessibility tools that Apple released as part of its recent updates. According to the company, the feature uses a combination of LiDAR, the camera, and on-device machine learning to determine how far users are from a door and to characterize its properties, such as whether it is open or closed.


The Door Detection feature can assist individuals in opening a closed door, whether by pushing, turning a knob, or pulling a handle. It is also said to read signs and symbols near the door, such as the room number, and even recognize the presence of an accessible entrance symbol.

The Door Detection feature will be enabled through the pre-installed Magnifier app and will work on the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020 and 2021), and iPad Pro 12.9-inch (2020 and 2021).


In addition to the Magnifier enhancements, the company revealed that Apple Maps will offer sound and haptic feedback for users who have enabled VoiceOver, helping them identify the starting point for walking directions.

Apple unveils Live Captions

Apple also unveiled Live Captions on the iPhone, iPad, and Mac for deaf users and those with hearing difficulties. The feature will be available in beta in English later this year on the iPhone 11 and later, iPad models with the A12 Bionic chip and later, and Macs with Apple silicon, for users in the United States and Canada.

According to the company, Live Captions will work with any audio content, including phone and FaceTime calls, video conferencing and social media apps, and streaming media, and even when users are conversing with someone nearby.

Users can adjust the text size to make captions easier to read. In FaceTime, auto-transcribed dialogue will be attributed to individual call participants, making it easier for users with hearing impairments to follow group video chats.


According to Apple, Live Captions on Mac will include the ability to enter a response and have it read out in real time to other participants in the conversation. It also stated that Live Captions would be generated on the device, with user privacy and security in mind.

The new languages, locales, and voices will also be available to the Speak Selection and Speak Screen features. Additionally, VoiceOver on Mac will work with the new Text Checker tool to catch formatting issues such as duplicated spaces and misplaced capital letters.

New themes and customization options, such as bolding text and adjusting line, character, and word spacing, will be added to the preloaded Apple Books app to give users a more accessible reading experience. In addition, starting this week, a new Accessibility Assistant shortcut in the Shortcuts app for Mac and Apple Watch will recommend accessibility features based on user preferences.

Apple upgrades

Park Access for All, a new resource from the National Park Foundation, will be available on Apple Maps to help people locate accessible attractions, programmes, and services in parks around the United States. Apple Maps will also highlight businesses and organisations that value, embrace, and prioritise the Deaf community and sign languages.

Users can also find accessibility-focused apps and developer stories in the App Store, as well as stories by and about people with disabilities in Apple Books' Transforming Our World collection. Apple Music will also highlight the Saylists playlists, each of which focuses on a distinct speech sound.