Exciting Update for Meta Ray-Ban Smart Glasses: Users Can Now Ask Questions Based on What They See


The Meta Ray-Ban Smart Glasses, introduced last year, are set to receive an update next month that will let users ask questions about their surroundings. While the glasses already include AI capabilities, image recognition has been unavailable until now.

According to a report by The New York Times, the upcoming update has been in testing with select users since December and is slated to reach the stable release next month. Using the glasses' built-in camera together with image recognition, users will be able to ask questions about objects and scenes they encounter. The multimodal Meta AI model is designed to answer queries ranging from identifying buildings and animals to reading food packaging.

Despite Meta's earlier promises to integrate image recognition, the feature has been absent until now. Users could, however, already access the digital assistant Meta AI, which also powers other Meta products such as the Quest headsets and apps like WhatsApp and Instagram. Saying "Hey Meta" activates the assistant, which delivers its responses through the glasses' built-in speakers.

This update marks a significant advancement for the Meta Ray-Ban Smart Glasses, elevating them from simple wearables to intelligent tools that can understand and answer questions about the wearer's surroundings. With image recognition on board, the glasses gain considerable utility and convenience, strengthening their position as a pioneering product in the wearable technology market.