Meta is rolling out new software for its smart glasses that boosts real-world conversations with voice amplification and adds artificial intelligence (AI)-powered Spotify playlists that adapt to what users are looking at.
First teased at Meta Connect in September, Conversation Focus is now beginning to land on supported devices.
The feature uses audio processing to amplify the voices of people around you, making it easier to follow a conversation in crowded or loud environments.
When switched on, voices sound slightly brighter, helping them cut through background noise. Users can activate it with a voice command – “Hey Meta, start Conversation Focus” – or assign it to a tap-and-hold shortcut on the frames.
Meta is also expanding its partnership with Spotify through a new multimodal AI feature.
Wearers can now ask their glasses to play music that matches what they’re looking at by saying, “Hey Meta, play a song to match this view.”
Spotify then generates a playlist tailored to both the scene and the listener’s personal taste, turning visual context – like holiday decorations or city views – into an on-the-fly soundtrack.
The updates are rolling out to Ray-Ban Meta smart glasses (Gen 1 and Gen 2) and Oakley Meta HSTN frames, starting with users enrolled in Meta’s early access program before expanding more broadly.
Meanwhile, Meta’s Oakley Meta Vanguard shades are gaining single-word voice commands. Saying “photo” or “video” is now enough to capture content – a small but thoughtful tweak aimed at athletes who do not want to waste breath mid-run or ride.
Together, the updates underline Meta’s push to make smart glasses feel less futuristic – and more genuinely useful.