HyperAI

Google’s Gemma AI Models Surpass 150 Million Downloads, But Still Trail Meta’s Llama


Google's Gemma AI models have passed a significant milestone, surpassing 150 million downloads. The announcement was made over the weekend by Omar Sanseviero, a developer relations engineer at Google DeepMind, in a post on X. He added that developers on the AI development platform Hugging Face have created more than 70,000 variants of Gemma.

Launched in February 2024, Gemma was designed to compete with other open model families such as Meta’s Llama. The most recent versions are multimodal, handling both images and text, and support more than 100 languages. Google has also fine-tuned some Gemma models for specific applications, including drug discovery.

Despite reaching 150 million downloads in a little over a year, Gemma still trails Meta’s Llama, which surpassed 1.2 billion downloads in late April 2025. Both families have also drawn criticism for their custom, non-standard licensing terms, which some developers argue introduce legal and commercial risks and make the models less attractive for commercial projects. This licensing issue remains a significant obstacle to broader adoption and integration into business operations.

Nevertheless, the download numbers and the tens of thousands of community-built variants indicate strong interest and a growing ecosystem around Gemma. As competition among AI platforms intensifies, addressing these licensing concerns will be crucial if Google is to fully capitalize on Gemma’s momentum and strengthen its position in the AI space.
