
Meta Unveils $800 AI Glasses Amid Slow AI Progress as Superintelligence Push Faces Hurdles

At Meta’s annual Connect conference, the company unveiled its latest $800 Ray-Ban smart glasses, a high-profile hardware launch that overshadowed its broader AI ambitions. The new glasses feature a built-in display capable of showing text messages, maps, and real-time captions overlaid on the user’s view of the world. Yet, despite the fanfare, the AI capabilities powering the device were underwhelming and barely demonstrated during the event.

Mark Zuckerberg introduced Live AI, a hands-free mode designed to act as a real-time conversational assistant by using the glasses’ camera and microphone to interpret the user’s surroundings and deliver instant suggestions. The demo faltered, with Zuckerberg citing poor venue WiFi as the culprit. A later showcase of a new metaverse feature allowed users to generate complex 3D environments in Meta’s Quest headsets with simple text prompts—highlighting the company’s focus on AI-driven creativity, but still far from a fully realized product.

This contrast underscores a growing disconnect: while Meta has poured billions into AI, including a $14.3 billion investment in Scale AI and a $72 billion infrastructure budget for the year, the tangible results are still emerging. The company launched its new Superintelligence division in June, aiming to build systems that could assist users in real time by seeing, hearing, and understanding their world—what Zuckerberg calls “personal superintelligence.” But the technology remains largely in development.

Alexandr Wang, who now leads the Superintelligence team after joining Meta from Scale AI, acknowledged the rapid pace of change. In a recent interview, he described building an AI lab from scratch in just 60 days as “incredible,” but also admitted the effort is still in its early stages. The transformation has been sweeping: Meta restructured its entire AI organization, dissolving its former AGI unit and creating four new teams focused on research, products, training, and infrastructure.
Even Meta’s renowned AI research arm, FAIR, now reports to Wang, along with top executives like former GitHub CEO Nat Friedman and Chief AI Scientist Yann LeCun. Wang also launched a new elite lab, TBD, dedicated to developing next-generation AI models. While the internal shake-up is disruptive, it reflects Meta’s urgency to catch up with rivals like OpenAI, Google, and Anthropic.

Meanwhile, competition is heating up. Google is developing its own AI glasses with a display, OpenAI is collaborating with Jony Ive on a pocket-sized AI device, Snap is preparing holographic AR glasses, and Amazon has already entered the space with smart eyewear for calls and music.

Meta’s new glasses will be available in a few weeks, but the real test lies ahead: when will the company deliver AI-powered features that live up to the hype? For now, the hardware debut may be the headline, but the AI revolution is still unfolding behind the scenes.
