Meta updates smart glasses with conversation focus and Spotify AI

Meta is rolling out a software update for its smart glasses that introduces Conversation Focus, a feature to amplify voices in noisy environments. The update also adds AI-powered Spotify integration for context-based playlists. These enhancements are available first to early access users on Ray-Ban and Oakley models.

Meta announced a year-end software update for its lineup of smart glasses, fulfilling a promise made in September at the Meta Connect developer conference. The key addition is Conversation Focus, which uses the glasses' beam-forming microphones to directionally amplify the voice of the person in front of the wearer while filtering out background noise in crowded or noisy settings. "You’ll hear the amplified voice sound slightly brighter, which will help you distinguish the conversation from ambient background noise," Meta explains. Users can activate it with voice commands like "hey Meta, start Conversation Focus" or by setting a tap-and-hold shortcut.

The V21 update also introduces a multimodal AI feature in partnership with Spotify. By saying "hey Meta, play a song to match this view," the user prompts the glasses to analyze what they are looking at—such as holiday decorations—and generate a personalized playlist based on their taste and the scene. The capability is similar to features Google has announced for its own smart glasses, expected next year.

The rollout targets Meta Ray-Ban glasses (both first- and second-generation models) and Oakley Meta HSTN frames, starting with early access participants before gradually reaching all users. For the Oakley Meta Vanguard shades, aimed at athletes, the update enables single-word voice commands like "photo" to snap a picture or "video" to start recording, helping users "save some breath" during runs or bike rides. Additional features include voice command shortcuts and programmable workouts integrated with Garmin watches.

Similar to FDA-approved hearing assistance glasses from Nuance Audio, Conversation Focus aims to improve accessibility in social situations. The firmware updates may take time to propagate, potentially delaying availability for some users.

Related news


Meta faces class-action lawsuit over Ray-Ban smart glasses privacy


A class-action lawsuit has been filed against Meta, accusing the company of misleading consumers about the privacy features of its Ray-Ban smart glasses. The suit follows a Swedish report revealing that contractors in Kenya reviewed sensitive footage captured by the devices, including bathroom use and intimate moments. Meta has confirmed using human reviewers for some data but claims privacy protections are in place.

At Google's New York offices, prototypes of smart glasses demonstrated advanced features like real-time translation and app integration. These devices, blending AI assistance with wearable tech, are set to launch in 2026 from major companies. The trend signals a shift toward everyday augmented reality companions.


Meta is developing facial recognition technology for its smart glasses, potentially launching as soon as this year, according to a New York Times report. The feature, codenamed Name Tag, aims to help users identify people they know through AI. Privacy concerns have delayed its rollout in the past; the company now reportedly views the distracted political landscape as an opportunity to introduce it.

Meta is discontinuing its standalone Workrooms app for virtual reality meetings on February 16, 2026, amid broader efforts to reduce spending on the metaverse. The company is laying off more than 1,000 employees from its Reality Labs division and closing three VR studios. This shift prioritizes investments in AI hardware, such as smart glasses.


Modern noise-canceling headphones now feature automatic conversation detection, allowing users to engage in talks without pausing their audio manually. This technology, available on flagship models from Apple, Samsung, Google, and Sony, uses built-in microphones to sense speech and adjust settings seamlessly. It bridges the gap between immersive listening and real-world conversations effortlessly.

At CES 2026 in Las Vegas, Lumus demonstrated advanced waveguides that promise to give smart glasses significantly wider fields of view. The company's Z-30 model offers a 30-degree FOV, while a prototype achieves 70 degrees, potentially transforming wearable optics.


Google has showcased new Gemini AI integrations for its TV platform at CES 2026, offering voice-controlled settings adjustments and enhanced photo features. Demonstrations highlighted practical tools alongside more creative but less essential options. The updates aim to make smart TVs more interactive for everyday users.

