Meta has unveiled the largest iteration of its mostly free Llama 3 artificial intelligence models, highlighting multilingual capabilities and performance metrics that compete closely with paid models from rivals such as OpenAI. According to Meta, the new Llama 3 model can converse in eight languages, produce higher-quality computer code, and solve more complex math problems than its predecessors. The latest version, with 405 billion parameters, is far larger than the model Meta released last year but remains smaller than leading models from competitors: OpenAI's GPT-4, for instance, is reported to have one trillion parameters, and Amazon is developing a model with 2 trillion parameters.
Meta CEO Mark Zuckerberg promoted Llama 3 across multiple platforms, saying he believes future Llama models will surpass proprietary competitors by next year. He also said the Meta AI chatbot, which is powered by these models, is on track to become the most popular AI assistant by the end of this year, with hundreds of millions of users already engaging with it.
The release comes amid a race among tech companies to show that their resource-intensive large language models deliver substantial improvements in areas such as advanced reasoning, justifying the enormous sums invested in them. However, Meta's leading AI scientist has suggested that these models may hit limits in reasoning ability, and that other types of AI systems may be needed for major breakthroughs.
In addition to the flagship 405 billion parameter model, Meta is also rolling out updated versions of its more lightweight 8 billion and 70 billion parameter Llama 3 models, initially introduced in the spring. All three models are multilingual and can manage larger user requests through an expanded "context window," which, according to Ahmad Al-Dahle, Meta's head of generative AI, will enhance the experience of generating computer code.