AWS launch Meta Llama 3 on Amazon Bedrock

Just as we were catching our breath after Claude 3 Opus arrived on Amazon Bedrock, here we go again — this time with Meta's Llama 3 model, designed for you to build, experiment, and responsibly scale your generative artificial intelligence (AI) applications, now with improvements in reasoning, code generation, and instruction following. Read the announcement here.
According to Meta’s Llama 3 announcement, the Llama 3 model family is a collection of pre-trained and instruction-tuned large language models (LLMs) in 8B and 70B parameter sizes. These models have been trained on over 15 trillion tokens of data — a training dataset seven times larger than that used for Llama 2, including four times more code. The models also support an 8K context length, double that of Llama 2.
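If you want to try the model from code, a minimal sketch using the AWS SDK for Python (boto3) and the Bedrock Runtime `invoke_model` API might look like the following. The model ID, the Llama 3 instruct prompt template, and the request body fields (`prompt`, `max_gen_len`, `temperature`) are assumptions drawn from the documented Llama 3 on Bedrock conventions — check the current Amazon Bedrock documentation for your region before relying on them.

```python
import json

# Assumed model ID for the Llama 3 8B Instruct model on Amazon Bedrock.
MODEL_ID = "meta.llama3-8b-instruct-v1:0"


def build_llama3_request(user_prompt: str,
                         max_gen_len: int = 512,
                         temperature: float = 0.5) -> str:
    """Wrap a user prompt in the Llama 3 instruct template and
    serialise the Bedrock request body as JSON."""
    prompt = (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n"
        f"{user_prompt}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n"
    )
    return json.dumps({
        "prompt": prompt,
        "max_gen_len": max_gen_len,
        "temperature": temperature,
    })


def invoke_llama3(user_prompt: str) -> str:
    """Call the model via Bedrock Runtime. Requires AWS credentials
    and Llama 3 model access enabled in your account."""
    import boto3  # imported here so the helper above works offline
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(
        modelId=MODEL_ID,
        body=build_llama3_request(user_prompt),
    )
    return json.loads(response["body"].read())["generation"]
```

The helper that builds the request body is separated from the network call so the prompt template can be inspected and tested without AWS credentials.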
About the Author
Mario Thomas is a Chartered Director and Fellow of the Institute of Directors (IoD) with nearly three decades bridging software engineering, entrepreneurial leadership, and enterprise transformation. As Head of Applied AI & Emerging Technology Strategy at Amazon Web Services (AWS), he defines how AWS equips its global field organisation and clients to accelerate AI adoption and prepare for continuous technological disruption.
An alumnus of the London School of Economics and guest lecturer on the LSE Data Science & AI for Executives programme, Mario partners with Boards and executive teams to build the knowledge, skills, and behaviours needed to scale advanced technologies responsibly. His independently authored frameworks — including the AI Stages of Adoption (AISA), Five Pillars of AI Capability, and Well-Advised — are adopted internationally in enterprise engagements and cited by professional bodies advancing responsible AI adoption, including the IoD.
Mario's work has enabled organisations to move AI from experimentation to enterprise-scale impact, generating measurable business value through systematic governance and strategic adoption of AI, data, and cloud technologies.