
Focus on Llama's New Use Cases and AR Glasses: Zuckerberg Uses AI to Empower the Metaverse as the Meta Connect 2024 Developer Conference Livestream Is Scheduled


The Meta Connect 2024 annual developer conference will be held at 1 a.m. Beijing time on September 26 (10 a.m. on September 25, US Pacific Time). Meta CEO Mark Zuckerberg will focus on AI and the metaverse and share Meta's latest products and services. HyperAI will livestream the event on its video account at the same time.

Driving innovation through open source: unlocking the potential of the Llama model

According to the schedule on the official website, Chaya Nayak, Product Manager at Meta, will present the details of Llama, the world's most powerful open-source model, in this session. She will also discuss Llama's capabilities and potential application areas, and share how the Meta team is building a community around the Llama model in the hope of inspiring new innovation.

As we all know, in July this year Meta open-sourced the Llama 3.1 series of large models in 8B, 70B, and 405B parameter sizes. The 405B-parameter Llama model is comparable to top closed-source models such as GPT-4o and Claude 3.5 Sonnet on multiple benchmarks. After the release, Turing Award winner Yann LeCun proudly noted on social media that "large enterprises, small businesses, startups, governments, universities, etc. — more and more researchers continue to improve the Llama model and propose new use cases!" This session will explore Llama's capabilities and potential applications in depth, and is expected to open up new application scenarios for the model.
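For readers who want a feel for how accessible the open weights are, below is a minimal sketch of chatting with the smallest Llama 3.1 model through the Hugging Face transformers library. The model ID and chat format follow Meta's public release at the time of writing, but treat this as an illustration rather than official sample code: it assumes you have accepted the model's license on Hugging Face and have a GPU with enough memory.

```python
# Minimal sketch: chat with Llama 3.1 8B Instruct via Hugging Face transformers.
# Assumes `pip install transformers torch accelerate` and that access to the
# gated meta-llama repository has been granted to your Hugging Face account.
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.1-8B-Instruct",  # public model ID at time of writing
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize what an open-weight LLM is in one sentence."},
]

# For message lists, the pipeline applies the Llama 3.1 chat template automatically
# and returns the conversation with the assistant's reply appended at the end.
output = chat(messages, max_new_tokens=128)
print(output[0]["generated_text"][-1]["content"])
```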

Meta has also built its own intelligent assistant, Meta AI, on top of the Llama model, and Zuckerberg has predicted that the assistant may eventually surpass ChatGPT. Meta AI is already widely available across Meta's applications, including Facebook, Instagram, Messenger, and WhatsApp. At this conference, Meta's senior engineering director Christine Awad will share how Meta uses Llama to drive the iteration and upgrade of Meta AI and other products, a reference case that should help more developers decide where to focus next.

Zuckerberg's next goal: ushering in a new era of AR glasses

In fact, Meta AI has already been integrated into the Ray-Ban Meta smart glasses. The glasses are equipped with a 12MP ultra-wide-angle camera that can take clear photos and record video, and their built-in large model can analyze what the camera captures and respond. Users simply say "Hey Meta" and ask a question to have the glasses perform tasks such as translating foreign languages, checking the weather forecast, or playing music. The glasses were warmly received as soon as they launched, which surprised Zuckerberg and made him realize the huge potential of the glasses market: an enormous number of people around the world wear glasses, and if they upgraded their ordinary glasses to smarter ones, the potential user base could number in the billions.

To this end, while iterating on the smart glasses, Zuckerberg has also stepped up R&D on AR glasses, and Meta's first AR glasses will be unveiled at this Connect conference. Meta has previously announced that its next major product would be AR glasses, saying the device will combine the two technical paths of the Quest 3 headset and the Ray-Ban Meta glasses, letting users interact immersively with both the physical world and digital content while enjoying the practicality and entertainment of Meta AI for the best possible experience.

Earlier, foreign media also reported that Meta had filed a new patent related to AR glasses. The patent combines machine learning, augmented reality (AR), and AI agents, and can not only expand the user's visual experience but also provide visual, auditory, and even tactile assistance for users with visual or hearing impairments; it is considered another major breakthrough for Meta in the AR field. Zuckerberg has also commented that the AR smart glasses under development will outperform the Ray-Ban glasses and are expected to completely change how traditional glasses are used.

Expanding the Metaverse user base: the new Meta Quest 3S is more affordable

It is worth mentioning that Meta will also highlight its metaverse products at this conference. In addition to the AR glasses mentioned above, Meta will launch a new product: the Meta Quest 3S headset.

At last year's Meta Connect 2023 developer conference, Meta launched the world's first mass-market mixed-reality headset, Meta Quest 3. The headset works together with onboard sensors and AI algorithms: by simply double-tapping the side of the Quest 3, users can switch freely between VR and MR modes, choosing to enter a virtual world that defies the laws of physics or a real environment overlaid with virtual elements.


At this year's Meta Connect 2024 conference, Meta Quest 3S, the streamlined version of Meta Quest 3, will be officially unveiled. According to images leaked online, the front of the Meta Quest 3S headset features a triangular camera/sensor layout and a physical button, which is expected to switch the visual passthrough mode or support user customization of certain functions. Notably, the Quest 3S offers functionality similar to the Quest 3 but makes some hardware cuts to the optics and design, making it considerably cheaper than the Quest 3. This pricing strategy is presumably intended to bring more people into Meta's metaverse ecosystem.

At the same time, reports have circulated that the Quest 3S has received certification from the US Federal Communications Commission (FCC), which further signals that the product's release is imminent.