Meta lets loose second generation of Llama AI models

Meta says organisations can download Llama 2 for free, and run it wherever they wish, for research and commercial purposes.

Facebook-parent Meta has opened up access to Llama 2, the second generation of its Llama family of open source large language models (LLMs).

These models can be accessed via generative AI service providers such as Microsoft, AWS, and Hugging Face, among others, and are available under a new license that permits commercial use.

“We’re now ready to open source the next version of Llama 2 and are making it available free of charge for research and commercial use. We’re including model weights and starting code for the pretrained model and conversational fine-tuned versions too,” Meta wrote in a blog post.

The Llama 2 family is a collection of pretrained and fine-tuned generative text models in three sizes, 7 billion, 13 billion, and 70 billion parameters, including fine-tuned models called Llama-2-Chat.

The tuned models use supervised fine-tuning and reinforcement learning with human feedback, according to Meta. While the tuned models are mostly intended for chat-based assistants, the pretrained models can be used for a variety of natural language generation tasks, the company said.

Microsoft customers will be able to access the new generation of LLMs via the Azure AI model catalog, Meta said.

“The models are also optimised to run locally on Windows, giving developers a seamless workflow as they bring generative AI experiences to customers across different platforms,” it said.

AWS customers can access the new version of the LLMs via Amazon SageMaker JumpStart, a machine learning hub that provides access to large language models and other tools such as SageMaker Studio.

“You can now discover and deploy Llama 2 with a few clicks in Amazon SageMaker Studio or programmatically through the SageMaker Python SDK, enabling you to derive model performance and MLOps controls with SageMaker features such as Amazon SageMaker Pipelines, Amazon SageMaker Debugger, or container logs,” AWS said in a blog post.

Llama 2 models are available initially in the us-east-1 and us-west-2 AWS regions, the company added.
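
For readers who want to try the SDK path AWS describes, the sketch below shows roughly what a JumpStart deployment looks like with the SageMaker Python SDK. The model ID and request payload are assumptions based on JumpStart's text-generation conventions rather than details from this article, so check the JumpStart catalog for the exact identifiers available in your account and region.

    # Minimal sketch: deploying a Llama 2 model via SageMaker JumpStart.
    # Assumptions: the JumpStart model ID and the request payload shape below
    # are illustrative, not taken from this article; Meta's license terms must
    # be accepted before the model can be deployed.
    from sagemaker.jumpstart.model import JumpStartModel

    model = JumpStartModel(model_id="meta-textgeneration-llama-2-7b")  # assumed model ID
    predictor = model.deploy(accept_eula=True)  # provisions a real-time endpoint

    # Send a text-generation request to the deployed endpoint.
    response = predictor.predict({
        "inputs": "Explain what a large language model is in one sentence.",
        "parameters": {"max_new_tokens": 64, "temperature": 0.6},
    })
    print(response)

    predictor.delete_endpoint()  # clean up to avoid ongoing charges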

Enterprises and researchers can also download the new models from Hugging Face’s portal.
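
As a rough illustration of the Hugging Face route, the snippet below loads a Llama 2 checkpoint with the transformers library. The repository name is an assumed example of Meta's gated repos on the Hub; access has to be requested and a Hugging Face token configured before the download will work.

    # Minimal sketch: pulling a Llama 2 checkpoint from the Hugging Face Hub.
    # Assumption: "meta-llama/Llama-2-7b-hf" is an illustrative gated repo name;
    # a granted access request and an authenticated token are required.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "meta-llama/Llama-2-7b-hf"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Generate a short completion from the pretrained (non-chat) model.
    inputs = tokenizer("Open source language models are", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))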

While Meta describes the Llama 2 models as open source, its acceptable use policy imposes a long list of restrictions on what they can be used for, and the company provides a guide for their responsible use.

