Thu Oct 03 2024

Meta's smart glasses can now tell you where you parked your car.

Meta is rolling out some of the previously announced features for its AI-powered Ray-Ban smart glasses, targeted at users in the United States and Canada.

Meta has begun rolling out some of the previously announced features for its AI-driven Ray-Ban smart glasses, targeting users in the United States and Canada. Chief Technology Officer Andrew Bosworth announced on Threads that today's update improves natural language recognition, meaning wordy commands such as "Hey Meta, look and tell me" are now a thing of the past. Users can interact with the AI assistant without needing to include the "look and" part of their commands.

Most of the AI tools presented during last month's Connect event are also coming to the glasses this week, including voice messages, timers, and reminders. The glasses will also be able to use Meta AI to call phone numbers or scan QR codes. CEO Mark Zuckerberg showcased the new reminder functions in an Instagram video, demonstrating how the glasses can help locate a car in a parking garage.

A notable absence in this update is the live translation feature, and Bosworth did not provide a timeline for when it will be available. Meta's smart glasses had already grabbed attention earlier in the day when two Harvard students used them to surface personal information about strangers. Thanks to a combination of facial recognition technology and a large language model, they managed to reveal addresses, phone numbers, details about relatives, and parts of Social Security numbers.