Meta updates smart glasses with conversation focus and Spotify AI

Meta is rolling out a software update for its smart glasses that introduces Conversation Focus, a feature to amplify voices in noisy environments. The update also adds AI-powered Spotify integration for context-based playlists. These enhancements are available first to early access users on Ray-Ban and Oakley models.

Meta announced a year-end software update for its lineup of smart glasses, fulfilling a promise made in September at the Meta Connect developer conference. The key addition is Conversation Focus, which uses the glasses' beam-forming microphones to directionally amplify the voice of the person in front of the wearer while filtering out background noise in crowded or noisy settings. "You’ll hear the amplified voice sound slightly brighter, which will help you distinguish the conversation from ambient background noise," Meta explains. Users can activate it with voice commands like "hey Meta, start Conversation Focus" or by setting a tap-and-hold shortcut.

The V21 update also introduces a multimodal AI feature in partnership with Spotify. By saying “hey Meta, play a song to match this view,” the glasses analyze what the user is looking at—such as holiday decorations—and generate a personalized playlist based on the scene and the user's listening tastes. The capability invites comparisons to similar features Google has announced for its own smart glasses, expected next year.

The rollout targets Meta Ray-Ban glasses (both first- and second-generation models) and Oakley Meta HSTN frames, starting with early access participants before gradually reaching all users. For the Oakley Meta Vanguard shades, aimed at athletes, the update enables single-word voice commands like "photo" to snap a picture or "video" to start recording, helping users "save some breath" during runs or bike rides. Additional features include voice command shortcuts and programmable workouts integrated with Garmin watches.

Similar to FDA-approved hearing assistance glasses from Nuance Audio, Conversation Focus aims to improve accessibility in social situations. The firmware updates may take time to propagate, potentially delaying availability for some users.
