AI Sovereignty Series
AI governance is fragmenting into incompatible systems. Europe prioritises trust through transparency. America pursues speed through scale. China maintains control through integration. For Boards, this isn’t about flexible compliance across markets - these systems have diverged far enough that trying to serve all three means serving none well.
This series examines the strategic implications of AI sovereignty for UK businesses, from energy constraints that create structural disadvantages to regulatory divergence that forces explicit strategic choices. The sovereignty trilemma reveals that organisations can optimise for trust, speed, or control - but not all three simultaneously.
Articles in This Series
UK AI Sovereignty and the Energy Challenge
The economics are stark: powering a 100MW data centre in the UK costs approximately four times what it does in the United States. This isn’t a marginal cost difference - it’s a fundamental structural disadvantage affecting every UK business using AI services. This article examines the sovereignty paradox forcing UK businesses toward foreign-hosted AI services whilst exploring strategic responses.
The EU’s GPAI Code of Practice: Strategic Implications
The EU’s General-Purpose AI Code of Practice establishes trust as the primary currency of AI value through transparency requirements and systemic risk guardrails. This article analyses the strategic implications for UK businesses serving European markets and explores how compliance requirements might become competitive advantages.
Goldman Sachs and the Energy Diplomacy of AI
Goldman Sachs’ institutional analysis transforms energy sovereignty from policy concern to strategic imperative. Their $5 trillion infrastructure projection and “data centre diplomacy” concept validate that energy access determines AI capability. This article explores what their research means for UK Boards seeking competitive positioning in an energy-constrained landscape.
A Board’s Guide to Navigating Conflicting National Agendas
AI governance is fragmenting into three incompatible visions: Europe’s trust through transparency, America’s speed through scale, and China’s control through integration. This article introduces the sovereignty trilemma and presents three strategic stances for navigating these landscapes without fracturing your strategy.
A New Grid Actor: AI Infrastructure Is Becoming Energy Infrastructure
America’s 19GW power shortfall by 2028 is forcing hyperscalers to build their own generation, but the strategic insight is what happens next: surplus capacity transforms AI infrastructure operators from energy consumers into grid actors. This article examines how distributed generation reshapes the relationship between technology companies and national grids, and what this means for UK Boards navigating energy sovereignty.
Key Concepts
The Sovereignty Trilemma: Organisations can optimise for trust, speed, or control in AI governance - but not all three simultaneously. European organisations optimising for trust accept slower development cycles. American organisations optimising for speed accept regulatory uncertainty. The trilemma forces explicit strategic choices.
Data Centre Diplomacy: Goldman Sachs’ concept describing how AI infrastructure decisions carry geopolitical weight. Unlike oil reserves, which are fixed by geography, data centres can be built wherever nations choose, allowing nations to leverage AI infrastructure as a critical geopolitical and economic tool.
Energy as Strategic Factor: Energy infrastructure is becoming the determining factor in AI competitiveness. UK businesses face a 4x energy cost disadvantage compared to US competitors, creating structural challenges that affect every AI-dependent operation.
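The scale of that 4x gap is easier to feel with rough numbers. The sketch below uses assumed, illustrative tariffs (not figures from this series), chosen only to reflect the approximate ratio, and computes the annual electricity bill for a 100MW facility running flat out:

```python
# Illustrative only: both tariffs are assumptions, not figures from this series.
UK_PRICE_PER_KWH = 0.26   # assumed GBP-equivalent industrial rate
US_PRICE_PER_KWH = 0.065  # assumed rate, picked to reflect the ~4x gap

CAPACITY_MW = 100
HOURS_PER_YEAR = 24 * 365

def annual_energy_cost(price_per_kwh: float) -> float:
    """Annual electricity cost for a data centre drawing CAPACITY_MW continuously."""
    return CAPACITY_MW * 1_000 * HOURS_PER_YEAR * price_per_kwh

uk_cost = annual_energy_cost(UK_PRICE_PER_KWH)
us_cost = annual_energy_cost(US_PRICE_PER_KWH)
print(f"UK: {uk_cost/1e6:.0f}m/yr, US: {us_cost/1e6:.0f}m/yr, "
      f"ratio: {uk_cost/us_cost:.1f}x")
```

Under these assumed tariffs the UK operator pays on the order of £200m more per year for the same compute - a recurring cost that no efficiency programme at the margin can offset.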
Grid Actors: The emerging phenomenon where hyperscalers with behind-the-meter generation transition from energy consumers to grid participants, with surplus capacity creating new strategic and regulatory considerations.
Related Frameworks
This series connects to broader governance themes in my toolkit:
- Minimum Lovable Governance — Building just-enough governance
- AI Stages of Adoption — Understanding maturity implications
- Well-Advised Strategic Priorities — Balancing competing strategic demands