At Apple’s recent iPhone 16 event, the company unveiled its new “Visual Intelligence” feature. This tool allows users to scan their surroundings via the iPhone’s camera to identify objects quickly, fetch details from posters, or look up just about anything around them. While it’s an incredibly useful feature for the iPhone, it also hints at Apple’s future ambitions, particularly its long-rumored AR glasses.

Imagine just glancing at a restaurant and having your glasses provide you with information, without needing to pull out your phone. That’s the potential of Visual Intelligence integrated into AR glasses—technology Apple seems to be laying the groundwork for with this new feature.

Apple isn’t the first to explore this idea. Meta, with its Ray-Ban smart glasses, has already shown how wearable tech with an AI assistant can identify objects and provide information on demand. However, Apple’s approach would likely offer seamless integration with its ecosystem of apps and services, elevating the experience.

Apple’s Vision Pro is the company’s most advanced AR/VR headset, but it isn’t meant for everyday outdoor use. Lightweight AR glasses, by contrast, could give users an everyday wearable that’s both functional and stylish. Reports suggest Apple’s true AR glasses might not arrive until 2027, but Visual Intelligence gives the company a head start on refining the software that future device will need.

Apple spent years building AR features for the iPhone before launching Vision Pro, so Visual Intelligence may be the first step toward the software foundation for AR glasses. With tech companies like Meta, Snap, and Google already heavily invested in AR glasses, Apple will likely use Visual Intelligence to stand out in the growing market when the time is right.

It’s not a question of if Apple will release AR glasses, but when—and Visual Intelligence could be the key to unlocking that future.