Thu Dec 19 2024

The Ray-Ban Meta smart glasses incorporate visual artificial intelligence and real-time translation.

Two anticipated artificial intelligence features will be available on the Ray-Ban Meta smart glasses in an upcoming update.

Meta is bringing two highly anticipated features to its popular Ray-Ban smart glasses: real-time visual artificial intelligence and live translation. Both are currently in testing, but all Ray-Ban Meta Smart Glasses owners are expected to gain access to a live assistant that can see, hear, and translate Spanish, French, and Italian. The launch is part of the v11 update, which also adds Shazam integration for music recognition.

The glasses rely on built-in cameras, speakers, and microphones, so these functions no longer require pulling out a phone. Previously, activating the AI assistant required a wake word, which could interrupt the flow of a conversation. The new real-time AI keeps the assistant continuously available, making interaction smoother and responses faster.

What sets the live AI feature apart is that it works from a continuous video stream, though privacy is not compromised in the sense that the feature must be explicitly enabled by the user in the glasses' settings and can be disabled at any time. The update is being tested first with early access users and is expected to reach more owners in the coming weeks or months, depending on feedback.

The Ray-Ban Meta Smart Glasses, which look like ordinary sunglasses, are priced at $299 and continue to improve through over-the-air updates. Competition in the category is growing, but these are the first smart glasses to capture real mass-market interest, standing out for their value and functionality.