
Apple to bring Visual Intelligence feature to iPhone 15 Pro
What's the story
Apple has confirmed plans to bring its Visual Intelligence feature, a Google Lens-like tool, to the iPhone 15 Pro.
The announcement came during a chat with Daring Fireball's John Gruber.
The AI-powered tool debuted with the iPhone 16 series in September 2024.
It lets users explore their surroundings by simply pointing their phone's camera at objects and landmarks around them.
User experience
Visual Intelligence's functionality and accessibility
The Visual Intelligence feature, which is part of Apple Intelligence, can identify a wide range of items, from animals and plants to landmarks and businesses.
All you have to do is point your phone at the object you want to explore. The AI then gives you detailed information about the item.
On iPhone 16 models, you can long-press the Camera Control button to either run a Google search or ask ChatGPT questions about the object.
Feature activation
Activation on iPhone 15 Pro and 16e
The Visual Intelligence feature was first introduced via the Camera Control button on the iPhone 16 series.
However, Apple has now made it available on the iPhone 16e, which lacks that button. Instead, the feature is triggered through the Action Button.
This suggests it could also work on other models with an Action Button, such as the iPhone 15 Pro.
Apple has not yet revealed which update will bring this feature to iPhone 15 Pro users.