Apple’s Visual Intelligence may hint at future AR glasses development

Apple’s Visual Intelligence feature, showcased at the iPhone 16 event, allows users to scan their surroundings to identify objects and retrieve information. This feature could be a key step toward Apple’s much-anticipated AR glasses, enabling seamless real-time information gathering and integration with Apple’s app ecosystem.

Apple introduced a new feature dubbed “Visual Intelligence” at the iPhone 16 event. It lets users point the iPhone’s camera at their surroundings to identify objects, pull information from posters, and search for nearly anything around them. While it may appear to be a standalone feature, it could be a precursor to a much larger development: Apple’s long-rumored AR glasses.

Visual Intelligence’s ability to recognize and surface information about a variety of items in real time is exactly the kind of technology that would be vital for any augmented reality device, especially smart glasses. For instance, Apple showcased how users could scan a restaurant with their iPhone to learn more about it. Built into AR glasses, the same feature could work hands-free: users would simply look at the restaurant and ask the glasses for details, without ever pulling out a phone.

Meta has already explored this concept with its Ray-Ban smart glasses, showing that AI assistants in eyewear can be genuinely useful. Apple’s potential AR glasses, however, would likely aim for a more refined design and user experience, with tight integration across Apple’s apps and services.

Apple already has the Vision Pro, a headset equipped with multiple cameras, but it’s not a device most people would wear outside the home. AR glasses, by contrast, are expected to be lightweight enough for everyday use, and Visual Intelligence feels like a move toward that goal. Reports have suggested that true Apple AR glasses may not reach the market until 2027 or later, but shipping features like Visual Intelligence now gives Apple years to refine the software before the hardware arrives.

While the timeline for Apple’s AR glasses remains uncertain, the company is clearly laying the groundwork. Visual Intelligence could eventually become the foundation of a killer app for AR devices. Just as Apple gradually built AR capabilities such as ARKit into the iPhone before releasing the Vision Pro, it could be doing the same now: developing the software for its glasses long before the hardware is ready.

The broader industry is also treating AR glasses as the next big frontier in tech, with Meta, Snap, Google, and Qualcomm all investing heavily in the space. When Apple finally enters the category, Visual Intelligence will likely play a key role in its strategy to compete.