Apple is developing several new accessibility features, including the ability to use an iPhone to detect a door.
The Door Detection feature is designed for people who are blind or have low vision and need help navigating the last few feet before they arrive at a destination. The technology taps the iPhone’s camera, LiDAR scanner, and machine-learning algorithms to discern the various attributes of a door ahead of the user, including whether it's open or closed and how it can be opened.
The feature is also smart enough to read the information around a door, such as a room number or an accessible-entrance symbol.
As an example, the company uploaded a video showing Door Detection identifying the distance to a door and reading out the text on the door's surface.
Apple is adding this feature to Magnifier, a built-in iPhone app that already contains “People Detection” and “Image Description” functions to help blind users better understand their surroundings. However, the Door Detection option will only be available on the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, and iPhone 12 Pro Max, along with newer iPad Pro models.
Another upcoming accessibility feature that promises to be useful to all users is live captioning, which is coming to the iPhone, iPad, and Mac. Users will be able to see live captions across various apps, including during a FaceTime call or a video-conferencing session. The feature can also live-caption an in-person conversation happening next to the phone.
In addition, the live captions are generated on the device, so no information about the conversation is shared with Apple's servers. However, the feature will only be available on the iPhone 11 or later.
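Apple has not published the API behind this Live Captions feature, but the general technique — speech recognition that never leaves the device — is already available to developers through the Speech framework. The sketch below is a minimal, hypothetical illustration of that approach; the function name and locale are assumptions, while the framework calls (`SFSpeechRecognizer`, `requiresOnDeviceRecognition`, `AVAudioEngine`) are real iOS APIs.

```swift
import Speech
import AVFoundation

// Hypothetical sketch of device-local live captioning using Apple's
// Speech framework. Assumes microphone and speech-recognition
// permissions have already been granted.
func startOnDeviceCaptions() throws {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is not supported on this hardware")
        return
    }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true  // audio is never sent to a server
    request.shouldReportPartialResults = true   // stream incremental captions

    // Feed microphone audio into the recognition request.
    let audioEngine = AVAudioEngine()
    let inputNode = audioEngine.inputNode
    let format = inputNode.outputFormat(forBus: 0)
    inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        request.append(buffer)
    }

    recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            // Each partial result updates the on-screen caption text.
            print(result.bestTranscription.formattedString)
        }
    }

    audioEngine.prepare()
    try audioEngine.start()
}
```

Setting `requiresOnDeviceRecognition` mirrors the privacy property the article describes: transcription happens locally, at the cost of requiring newer hardware that can run the recognition model.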
Read more on pcmag.com