Meta announced Monday that it has expanded its Ray-Ban smart glasses' capabilities, adding real-time language translation and AI-powered video features through a software update. The new functionality, available to Early Access Program members, implements features first revealed at the company's September Connect conference.
The v11 software update introduces an AI assistant that can process and respond to visual information in real time. Users can communicate across language barriers as the glasses translate conversations between English and three other languages: Spanish, French, and Italian. The translation feature operates through the glasses' open-ear speakers and provides text transcripts viewable on the user's phone.
"When you're talking to someone speaking one of those three languages, you'll hear what they say in English through the glasses' open-ear speakers or viewed as transcripts on your phone, and vice versa," Meta explained in its blog announcement.
The update also brings Shazam integration for U.S. and Canadian users, allowing them to identify songs through their smart glasses. Additional features include tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands. These capabilities were initially announced during Meta's annual Connect event in September.
The software update began rolling out Monday to Meta's Early Access Program members, bringing these AI-enhanced features to the latest version of the Ray-Ban Meta smart glasses and expanding the wearable's capabilities through AI integration and multilingual support.