Google Unveils Advanced AR Glasses at TED Conference

Thu 10th Apr, 2025

At the recent TED Conference in Vancouver, Canada, Google showcased significant advances in augmented reality by demonstrating its new AR glasses, powered by the Android XR operating system. The event marked a pivotal moment in the evolution of smart glasses: Google presented a prototype that closely resembles traditional eyewear.

The presentation featured Shahram Izadi, head of Android XR, and Nishtha Bhatia, product manager for glasses and artificial intelligence, who demonstrated the glasses' capabilities in a series of live scenarios. Among the notable features was real-time translation: Izadi had the glasses translate Farsi into English, with the translation displayed as subtitles on the glasses' screen.

Izadi noted that the glasses' real-time capabilities, such as visual recognition and responding to spoken queries, are still at an early stage. A standout function called "Memory," however, represents a significant advance: it allows the glasses to maintain contextual awareness of the user's surroundings and to remember visual information without explicit prompts. An integrated camera continuously captures environmental details to make this possible.

During the demonstration, Bathia posed a query to the AI assistant, Gemini, asking for the location of a misplaced hotel key card. The AI responded accurately, indicating that the card was on a shelf next to a vinyl record, showcasing the practical applications of the Memory function.
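
Conceptually, a memory feature like this pairs passively captured observations with a queryable index of where objects were last seen. The Python sketch below is purely illustrative: the class and method names (VisualMemory, record, locate) are invented for this example, and a real system would presumably work from camera frames and a multimodal model rather than pre-labeled strings.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch of a "Memory"-style feature. All names are
# invented for illustration and do not reflect Google's implementation.

@dataclass
class Observation:
    obj: str           # object recognized in the camera feed
    location: str      # description of where it was seen
    seen_at: datetime  # time of the sighting

class VisualMemory:
    """Keeps the most recent sighting of each recognized object."""

    def __init__(self) -> None:
        self._last_seen: dict[str, Observation] = {}

    def record(self, obj: str, location: str) -> None:
        # In a real device this would be fed by the camera pipeline,
        # not by explicit calls.
        self._last_seen[obj] = Observation(obj, location, datetime.now())

    def locate(self, obj: str) -> str:
        obs = self._last_seen.get(obj)
        if obs is None:
            return f"I haven't seen your {obj}."
        return f"Your {obj} is {obs.location}."

# Usage mirroring the demo: the glasses observe passively,
# then answer a spoken query later.
memory = VisualMemory()
memory.record("hotel key card", "on the shelf next to the vinyl record")
print(memory.locate("hotel key card"))
```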

First introduced at Google I/O 2024, the Memory feature is part of Google's broader Project Astra initiative. Google recently enhanced Gemini by giving it the ability to "see," which is what makes this function possible.

Izadi also clarified that the AR glasses are not standalone devices; they require a connection to a smartphone to function. This design keeps the glasses lightweight and compact, with the smartphone acting as the primary processing hub. The approach is similar to Meta's "Project Orion," which offloads processing to an external computing unit rather than a smartphone.
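
In such a tethered design the glasses act as a thin client: they capture input and render output, while the phone carries out the heavy computation. The sketch below illustrates that split with a local socket; the port, wire format, and "processing" step are hypothetical stand-ins under that assumption, not Google's or Meta's actual protocol.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5050  # hypothetical loopback link standing in for the tether

# The "smartphone" side: listens for sensor data and returns a result.
server = socket.create_server((HOST, PORT))

def phone_hub() -> None:
    conn, _ = server.accept()
    with conn:
        frame = conn.recv(4096)                    # stand-in for a camera frame
        result = f"processed {len(frame)} bytes"   # stand-in for on-phone inference
        conn.sendall(result.encode())

hub = threading.Thread(target=phone_hub)
hub.start()

# The "glasses" side: lightweight capture and display only.
with socket.create_connection((HOST, PORT)) as conn:
    conn.sendall(b"\x00" * 1024)     # pretend camera frame from the glasses
    print(conn.recv(4096).decode())  # "display" the phone's answer on the lens

hub.join()
server.close()
```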

The demonstrations from both Google and Meta suggest that the technology has matured to the point where practical augmented reality in eyewear is becoming feasible. Reports indicate that the first consumer-ready smart glasses could launch this year. Google's AR glasses are one candidate for release, but there are indications that Samsung, a key hardware partner, may introduce its own Android XR model, known as "Project Haean," by the end of 2025.

Additionally, Meta is expected to unveil an upgraded version of its Ray-Ban Meta glasses, codenamed Hypernova, featuring an integrated display. The new model is anticipated to be priced significantly higher than existing Ray-Ban options, which currently start at around €329 in Germany.

