TLDR
- Meta launched the v21 software update for its AI glasses, starting with Early Access users in the US and Canada.
- Conversation Focus uses the glasses’ open-ear speakers to amplify the voice of the person in front of the wearer, helping users hear clearly in loud environments.
- Amplification can be adjusted by swiping the glasses’ right temple or through connected device settings.
- Spotify integration allows users to say, “Hey Meta, play a song to match this view,” to trigger a personalized playlist.
- The update uses computer vision and AI to connect real-world visuals with Spotify’s recommendation system for tailored music.
Meta has released the v21 software update for its AI glasses, adding new features including voice amplification and Spotify interaction. Users in the Early Access Program across the US and Canada begin receiving the update today, with a broader rollout to follow gradually, making the glasses more dynamic and adaptable to varied environments throughout the holiday season.
Conversation Focus Enhances Voice Clarity in Loud Environments
Meta announced the arrival of Conversation Focus, a feature that helps users hear people more clearly in noisy surroundings. It uses the glasses’ open-ear speakers to amplify the voice of the person in front of the wearer during conversations. This update is available to Ray-Ban Meta and Oakley Meta HSTN users enrolled in the Early Access Program in the US and Canada. The feature responds to common use cases, such as dining in a crowded place or traveling on public transit.
Users can manually adjust the amplification by swiping the right temple of the glasses or through the settings on a connected device. Meta shared during Connect that Conversation Focus was designed to improve real-world listening. The company stated, “It helps people stay tuned into the moments that matter, even in loud environments.”
Meta continues to refine wearable audio, pairing hardware and software to improve natural interaction and voice clarity. The system builds on earlier updates and user feedback to tune real-time audio enhancement, and it gives the glasses practical value beyond the camera and assistant.
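Meta has not published how Conversation Focus works internally, so the following is only a minimal illustrative sketch of the general idea the feature describes: steering a small microphone array toward the person in front of the wearer and applying a user-adjustable gain step, loosely analogous to the swipe gesture on the right temple. Every function name and parameter here is invented for illustration and is not a Meta API.

```python
# Hypothetical illustration only: a generic delay-and-sum "focus" effect plus
# a stepped amplification control. None of this reflects Meta's actual code.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s


def delay_and_sum(channels: np.ndarray, mic_spacing_m: float,
                  sample_rate: int, steering_angle_deg: float) -> np.ndarray:
    """Sum a linear mic array's channels after delaying each so that sound
    arriving from `steering_angle_deg` (0 = straight ahead) adds constructively."""
    n_mics, n_samples = channels.shape
    angle = np.deg2rad(steering_angle_deg)
    out = np.zeros(n_samples)
    for m in range(n_mics):
        # Per-mic delay (in samples) for a plane wave from the steering angle.
        delay_s = m * mic_spacing_m * np.sin(angle) / SPEED_OF_SOUND
        shift = int(round(delay_s * sample_rate))
        out += np.roll(channels[m], -shift)
    return out / n_mics


def apply_focus_gain(signal: np.ndarray, level: int) -> np.ndarray:
    """Map a discrete amplification level (e.g. chosen by swiping) to a gain."""
    gain = 1.0 + 0.5 * max(0, min(level, 4))  # clamp to 0..4 steps
    return np.clip(signal * gain, -1.0, 1.0)


if __name__ == "__main__":
    rate = 16_000
    t = np.arange(rate) / rate
    # Two-mic toy example: the same "voice" tone on both channels plus noise.
    voice = 0.2 * np.sin(2 * np.pi * 220 * t)
    mics = np.stack([voice + 0.05 * np.random.randn(rate) for _ in range(2)])
    focused = delay_and_sum(mics, mic_spacing_m=0.12, sample_rate=rate,
                            steering_angle_deg=0.0)
    louder = apply_focus_gain(focused, level=2)
    print("peak before:", float(np.abs(focused).max()),
          "peak after:", float(np.abs(louder).max()))
```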
Meta AI Adds Spotify Integration to Glasses
Meta also introduced a new feature that connects Spotify with its glasses for a visual-based music experience powered by Meta AI. By saying, “Hey Meta, play a song to match this view,” users can trigger a personalized playlist.
The feature combines object recognition with Spotify’s personalization engine to deliver audio tailored to the wearer’s visual surroundings. Whether the wearer is looking at album art or seasonal scenery, the glasses can surface music aligned with the view. The experience is designed for Ray-Ban Meta and Oakley Meta models with Meta AI integration.
This is Meta’s first multimodal, AI-powered music feature built in collaboration with Spotify’s recommendation platform. It blends computer vision and context awareness into a new type of music interaction, with the aim of delivering real-time media experiences driven by what users see and hear. The update is part of the broader v21 rollout, which began today for Early Access users and will continue expanding in the coming weeks.
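Neither Meta nor Spotify has documented the internals of “play a song to match this view,” so the sketch below only mirrors the pipeline the article describes at a high level: camera frame, scene labels, then a mood-based playlist request. The label detector and playlist backend are stubbed, and every name and mapping here is an assumption rather than a real API.

```python
# Hypothetical pipeline sketch: image -> scene labels -> mood query -> playlist
# request. All functions and mappings are invented for illustration.

from dataclasses import dataclass


@dataclass
class PlaylistRequest:
    seed_terms: list[str]   # words handed to the recommendation backend
    reason: str             # human-readable explanation of the choice


# Toy mapping from detected scene labels to musical "moods".
LABEL_TO_MOOD = {
    "snow": "cozy winter acoustic",
    "christmas tree": "holiday classics",
    "beach": "sunny tropical pop",
    "album cover": "songs from this artist",
    "city at night": "late-night electronic",
}


def detect_labels(image_path: str) -> list[str]:
    """Stand-in for an on-device vision model; a real system would run
    object/scene recognition on the camera frame."""
    # Hard-coded so the example runs without a model.
    return ["snow", "christmas tree"]


def build_playlist_request(image_path: str) -> PlaylistRequest:
    labels = detect_labels(image_path)
    moods = [LABEL_TO_MOOD[label] for label in labels if label in LABEL_TO_MOOD]
    if not moods:
        moods = ["popular right now"]  # fallback when nothing is recognized
    return PlaylistRequest(seed_terms=moods,
                           reason=f"matched labels: {', '.join(labels)}")


if __name__ == "__main__":
    request = build_playlist_request("current_camera_frame.jpg")
    print(request.reason)
    print("query terms:", request.seed_terms)
```

In a production system the final step would hand those seed terms to a personalization service rather than print them; the point of the sketch is only how visual context can be reduced to a music query.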


