AWS launches Meta Llama 3 on Amazon Bedrock

London | Published in AI | 8 minute read
Playful image featuring three llamas in a tech-savvy environment, surrounded by servers, cloud icons, and digital devices, highlighting the integration of technology with a playful twist. (Image generated by ChatGPT-4o)

Just as we were catching our breath after Claude 3 Opus arrived on Amazon Bedrock, here we go again: this time it's Meta's Llama 3 model, designed for you to build, experiment, and responsibly scale your generative artificial intelligence (AI) applications, now with improvements in reasoning, code generation, and instruction following. Read the announcement here

According to Meta’s Llama 3 announcement, the Llama 3 family is a collection of pre-trained and instruction-tuned large language models (LLMs) in 8B and 70B parameter sizes. The models were trained on over 15 trillion tokens of data, a training dataset seven times larger than the one used for Llama 2 and containing four times more code, and they support an 8K context length, double that of Llama 2.
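If you want to try the model straight away, a minimal sketch of calling the 8B Instruct variant through the Bedrock Runtime API with boto3 might look like the following. The model ID, prompt template, and inference parameters shown are assumptions based on how Llama models are typically exposed on Bedrock; verify them against the model catalogue in your own account and Region.

```python
# Minimal sketch: invoking Meta Llama 3 (8B Instruct) via the Amazon Bedrock
# Runtime API with boto3. Model ID, prompt format, and parameter names are
# assumptions - confirm them in the Bedrock console for your account/Region.
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Llama 3 instruct models expect chat-style special tokens in the prompt.
prompt = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
    "Summarise the benefits of instruction-tuned LLMs in two sentences."
    "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n"
)

response = bedrock_runtime.invoke_model(
    modelId="meta.llama3-8b-instruct-v1:0",  # assumed identifier
    body=json.dumps({
        "prompt": prompt,
        "max_gen_len": 256,
        "temperature": 0.5,
        "top_p": 0.9,
    }),
)

# The response body is a stream; parse it and print the generated text.
print(json.loads(response["body"].read())["generation"])
```

Note that you need to request access to the Llama 3 models in the Bedrock console before the invocation will succeed.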


About the Author

Mario Thomas is a transformational business leader with nearly three decades of experience driving operational excellence and revenue growth across global enterprises. As Head of Global Training and Press Spokesperson at [Amazon Web Services](https://aws.amazon.com) (AWS), he leads worldwide enablement delivery and operations for one of technology's largest sales forces during a pivotal era of AI innovation. A Chartered Director and Fellow of the [Institute of Directors](https://www.iod.com), Mario partners with Boards and C-suite leaders to deliver measurable business outcomes through strategic transformation. His frameworks and methodologies have generated over two billion dollars in enterprise value through the effective adoption of AI, data, and cloud technologies.