On iPhone 16 models, Visual Intelligence lets you use the camera to learn more about places and objects around you. It can also summarize text, read text out loud, translate text, search Google for items, ask ChatGPT, and more. And thanks to iOS 18.4, currently in beta, iPhone 15 Pro models can get in on the action, too.
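Visual Intelligence itself isn't something third-party apps can call directly, but a couple of its ideas map onto frameworks Apple does expose. As a minimal sketch of the "read text out loud" idea, the Swift snippet below recognizes text in a photo with Vision and speaks it with AVFoundation. The TextReader class and readAloud(from:) method are illustrative names only; this is not how Apple's feature is implemented under the hood.

```swift
import Vision
import AVFoundation
import UIKit

// Rough approximation of "read text out loud" using public frameworks.
// Not Apple's implementation of Visual Intelligence - just an illustration
// of the on-device text recognition and speech APIs available to any app.
final class TextReader {
    private let synthesizer = AVSpeechSynthesizer()

    /// Recognizes text in a photo and speaks it aloud.
    func readAloud(from image: UIImage) {
        guard let cgImage = image.cgImage else { return }

        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation] else { return }

            // Join the top candidate for each detected line of text.
            let text = observations
                .compactMap { $0.topCandidates(1).first?.string }
                .joined(separator: " ")

            guard !text.isEmpty else { return }
            self?.synthesizer.speak(AVSpeechUtterance(string: text))
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```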
Apple has confirmed that the same Visual Intelligence customization options are coming to iPhone 15 Pro models via a software update. That update is iOS 18.4, which is slated for release in early April and is already available in public beta.
Maybe you've installed the iOS 18.4 beta on your iPhone 15 Pro, or maybe you're waiting for the final release. Either way, once you're running iOS 18.4 you can assign Visual Intelligence to the device's Action button: open the Settings app, tap Action Button, then swipe through the carousel of options until Visual Intelligence is selected.
Pressing and holding the Action button will now activate Visual Intelligence. Note that you can also add a new Visual Intelligence button to Control Center: open Control Center, tap the + button in the top-left corner, tap Add a Control, then search for Visual Intelligence and tap it to add it to your layout.
The Visual Intelligence interface features a live view from the camera, a button to capture a photo, and dedicated "Ask" and "Search" buttons. Ask sends your question to ChatGPT, and Search sends the image to Google Search.
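Conceptually, the Ask button amounts to a photo plus a question being handed off to ChatGPT. Apple's integration is private, but the general shape of such a request can be sketched against OpenAI's public chat completions API, as below. The askChatGPT function name is hypothetical, and the model name and API key are placeholders you'd substitute yourself.

```swift
import Foundation
import UIKit

// Minimal sketch of the "Ask" idea: send a captured photo plus a question
// to ChatGPT via OpenAI's public API. This is not Apple's integration;
// the model name and API key below are placeholders.
func askChatGPT(about image: UIImage, question: String, apiKey: String) {
    guard let jpeg = image.jpegData(compressionQuality: 0.8) else { return }
    let dataURL = "data:image/jpeg;base64," + jpeg.base64EncodedString()

    // Build a chat completions request with both text and image content.
    let body: [String: Any] = [
        "model": "gpt-4o",
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url", "image_url": ["url": dataURL]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data, let reply = String(data: data, encoding: .utf8) else { return }
        // Raw JSON response; a real app would parse choices[0].message.content.
        print(reply)
    }.resume()
}
```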
To learn about everything that you can do with Visual Intelligence, be sure to check out our dedicated guide.