AWS launches Meta Llama 3 on Amazon Bedrock

London | Published in AI | 10 minute read | Read the AWS blog post about this here

Playful image featuring three llamas in a tech-savvy environment, surrounded by servers, cloud icons, and digital devices, highlighting the integration of technology with a playful twist. (Image generated by ChatGPT-4o)

Just as we were catching our breath after Claude 3 Opus arrived on Amazon Bedrock, here we go again: this time it's Meta's Llama 3 model, designed for you to build, experiment, and responsibly scale your generative artificial intelligence (AI) applications, now with improvements in reasoning, code generation, and instruction following. Read the announcement here

According to Meta’s Llama 3 announcement, the Llama 3 model family is a collection of pre-trained and instruction-tuned large language models (LLMs) in 8B and 70B parameter sizes. These models were trained on over 15 trillion tokens of data — a training dataset seven times larger than that used for Llama 2, including four times more code. Llama 3 also supports an 8K context length, double that of Llama 2.
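To get a feel for what using these models on Bedrock looks like, here is a minimal sketch in Python. It assumes the boto3 `bedrock-runtime` client, the `meta.llama3-8b-instruct-v1:0` model ID being available in your region, and valid AWS credentials; the prompt template tokens follow Meta's published Llama 3 instruct format.

```python
import json


def format_llama3_prompt(user_message: str) -> str:
    # Llama 3 instruct models expect this chat template; the special
    # tokens come from Meta's Llama 3 model card.
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


def invoke_llama3(prompt: str,
                  model_id: str = "meta.llama3-8b-instruct-v1:0") -> str:
    """Invoke Llama 3 8B Instruct on Amazon Bedrock.

    Requires AWS credentials with Bedrock model access granted;
    the model ID shown is an assumption and may differ by region.
    """
    import boto3  # AWS SDK for Python (third-party)

    client = boto3.client("bedrock-runtime")
    body = json.dumps({
        "prompt": format_llama3_prompt(prompt),
        "max_gen_len": 512,
        "temperature": 0.5,
    })
    response = client.invoke_model(modelId=model_id, body=body)
    return json.loads(response["body"].read())["generation"]
```

The same `invoke_model` call works for the 70B variant by swapping the model ID; only the `prompt`/`max_gen_len`/`temperature` request shape is specific to the Llama family.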