
Rethinking Business Cases in the Age of AI: Building Your AI Business Case

London | Published in AI and Board | 18 minute read
[Image: A professional team collaborates around a conference table reviewing an AI business case document. Digital displays show multi-dimensional value metrics, ROI projections across different time horizons, and strategic alignment graphics. Image generated by ChatGPT 4o.]

In my previous articles, I’ve explored why traditional business cases fall short for AI investments, outlined the essential building blocks for effective evaluation, and provided a methodology for identifying high-value AI opportunities. Now we turn to the practical process of constructing business cases that capture AI’s unique characteristics while providing boards with the clarity they need for confident decision-making.

Having identified promising AI opportunities, the challenge shifts to articulating their value in ways that resonate with decision makers. Through my interactions with Chartered Directors and their Boards, I’ve observed a consistent pattern: the most successful AI business cases don’t simply adapt traditional templates — they fundamentally rethink the approach to capture AI’s distinctive value creation patterns.

The process I’m sharing isn’t theoretical; it’s built from practical experience guiding organisations through AI evaluation and investment decisions. While maintaining financial rigour, it expands beyond conventional models to address AI’s multi-dimensional impact. This approach helps boards understand not just the ‘what’ of AI investment but the ‘why’ and ‘how’ that determine long-term success.

Defining Strategic Intent with the Well-Advised Balanced Scorecard

The foundation of any compelling AI business case lies in clearly articulating strategic intent. While traditional business cases often begin with financial projections, effective AI business cases start with strategic alignment that answers fundamental questions: What business objectives does this initiative advance? How does it align with our broader strategic vision? Which stakeholders benefit and how?

This strategic narrative should explicitly connect the initiative to your organisation’s priorities using the Well-Advised Framework. Developed by me at AWS as an executive-focused counterpart to the technically-oriented Well-Architected Framework, Well-Advised is a universally adaptable balanced scorecard for evaluating AI investments, ensuring strategic alignment across any organisation or industry.

While other evaluation frameworks exist, such as Robert Kaplan and David Norton’s balanced scorecard with its financial, customer, internal process, and learning perspectives, Well-Advised’s inclusion of Responsible Business Transformation as a distinct pillar makes it particularly well-suited for AI business cases, where ethical considerations, governance, and responsible implementation are increasingly critical success factors.

I recommend creating a strategic alignment matrix that maps the initiative against each Well-Advised pillar, with specific metrics for each dimension. For example, a manufacturer implementing AI-driven predictive maintenance might display strategic alignment across multiple pillars (a simple data sketch follows the list below):

Innovation & New Products/Services:

Customer Value & Growth:

Operational Excellence:

Responsible Transformation:

Revenue, Margin & Profit:
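
To make this concrete, here is a minimal sketch of how such an alignment matrix might be captured as structured data, assuming Python as the modelling language. The pillar names follow the Well-Advised Framework; the metrics attached to each are invented placeholders for a hypothetical predictive maintenance initiative, not values prescribed by the framework.

```python
# Illustrative sketch only: a strategic alignment matrix for a hypothetical
# AI-driven predictive maintenance initiative, mapped to the Well-Advised pillars.
# The metrics shown are invented examples, not prescribed values.

alignment_matrix = {
    "Innovation & New Products/Services": [
        "New condition-monitoring service offering (example metric: time-to-market)",
    ],
    "Customer Value & Growth": [
        "Reduced unplanned downtime for customers (example metric: downtime hours avoided)",
    ],
    "Operational Excellence": [
        "Maintenance scheduled by predicted failure risk (example metric: % planned vs reactive work)",
    ],
    "Responsible Transformation": [
        "Technicians redeployed to higher-value diagnostics (example metric: hours reskilled)",
    ],
    "Revenue, Margin & Profit": [
        "Lower maintenance cost per asset (example metric: cost per operating hour)",
    ],
}

def unaligned_pillars(matrix: dict) -> list:
    """Return pillars with no stated alignment - a quick completeness check."""
    return [pillar for pillar, metrics in matrix.items() if not metrics]

print("Pillars still needing alignment evidence:", unaligned_pillars(alignment_matrix) or "none")
```

A structure like this keeps the strategic narrative honest: any pillar left empty is immediately visible to reviewers.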

Balancing the Scorecard

For AI portfolio management, the Well-Advised pillars suggest a balanced investment approach:

This distribution ensures a balanced portfolio delivering value across all strategic dimensions rather than overweighting initiatives in any single area. The exact distribution should reflect your organisation’s strategic priorities and current position in the AI Stages of Adoption.

Beyond this structured alignment, the qualitative narrative should address the initiative’s impact on key stakeholders. What changes will employees experience? How will customers’ interactions evolve? What new capabilities will the organisation develop? These qualitative dimensions provide crucial context for subsequent quantitative projections.

The strategic narrative should explicitly address the initiative’s position in your AI Stages of Adoption journey. Is this an experimental initiative designed to build initial capabilities? An adoption-stage project scaling proven approaches? Or a transformative initiative reshaping core business processes? This context helps boards understand how the initiative fits within your overall AI maturity progression.

By using Well-Advised as a balanced evaluation framework for AI business cases, you maintain consistency with your established strategic approach while ensuring comprehensive assessment across all dimensions that matter to your organisation.

Building the AI Cost Structure

AI initiatives involve cost categories that differ significantly from traditional technology investments. Effective business cases must capture these unique patterns while providing realistic projections that avoid common pitfalls.

I recommend structuring AI costs across three dimensions (similar to cloud adoption business cases, with some notable additions):

Implementation Costs

Implementation costs extend well beyond technology expenses to include:

A crucial difference from traditional implementations is the iterative nature of these costs. Unlike one-time expenses, AI implementation often involves multiple refinement cycles as models are developed, tested, and improved. Business cases should reflect this pattern rather than assuming linear deployment.

This refinement cycle is somewhat analogous to cost optimisation for cloud workloads: as new generations of compute became available on the cloud, we saw corresponding decreases in costs. In AI terms, models are becoming more sophisticated while GPU hardware becomes faster. This should reduce costs over time for like-for-like workloads, although costs can also rise in some cases because of the need for retraining and fine-tuning.

Operational Costs

Operational expenses for AI systems include several categories that require special consideration:

The most accurate business cases separate these operational costs from implementation expenses to provide realistic ongoing cost projections. They also account for how these costs evolve as the initiative matures through the AI Stages of Adoption, with particular attention to scaling considerations. As I’ve discussed in previous articles, there are significant cross-initiative economies of scale to be gained as organisations progress through the AI adoption journey. Shared data infrastructure, governance frameworks, and technical expertise can be leveraged across multiple AI initiatives, potentially reducing the marginal cost of each additional project. Business cases should explicitly acknowledge these economies of scale to avoid overestimating costs when evaluating portfolios of AI initiatives.

Another important consideration is the variable and sometimes unpredictable nature of operational costs for certain AI workloads. Unlike traditional IT implementations with stable usage patterns, user-facing AI applications such as chatbots can experience significant demand fluctuations, making cost forecasting particularly challenging. The inherent unpredictability of user interactions can lead to considerable variations in inference costs and resource requirements. In contrast, operational AI applications like predictive maintenance typically process more consistent data volumes with established parameters, creating more predictable cost profiles. Business cases should differentiate between these types of applications and include appropriate contingencies for more unpredictable workloads to avoid mid-implementation budget surprises.
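
To illustrate the forecasting challenge, the sketch below simulates monthly costs for a demand-driven chatbot workload alongside a steady batch-style predictive maintenance workload, and derives a contingency from the gap between the mean and a high-percentile outcome. The demand distributions, per-unit costs, and P90 planning rule are all assumptions chosen for illustration, not benchmarks.

```python
# Minimal sketch, illustrative assumptions only: cost variability for a user-facing
# chatbot workload versus a steady predictive-maintenance batch workload.
import random
import statistics

random.seed(42)

COST_PER_1K_CHATBOT_REQUESTS = 0.40   # assumed £ per 1,000 inference requests
COST_PER_BATCH_RUN = 35.0             # assumed £ per scheduled maintenance-model batch run

def simulate_monthly_costs(trials: int = 5_000):
    chatbot, maintenance = [], []
    for _ in range(trials):
        # Chatbot demand fluctuates widely month to month (heavy-tailed spread).
        requests = random.lognormvariate(mu=15.0, sigma=0.6)   # hypothetical monthly request volume
        chatbot.append(requests / 1_000 * COST_PER_1K_CHATBOT_REQUESTS)
        # Predictive maintenance runs a near-fixed number of batch jobs.
        runs = max(random.gauss(mu=30, sigma=2), 0)            # roughly daily batch, small jitter
        maintenance.append(runs * COST_PER_BATCH_RUN)
    return chatbot, maintenance

def p90(values):
    """Approximate 90th percentile - the level a contingency budget might plan to."""
    return sorted(values)[int(0.9 * len(values))]

chatbot_costs, maintenance_costs = simulate_monthly_costs()
for name, costs in [("Chatbot", chatbot_costs), ("Predictive maintenance", maintenance_costs)]:
    mean = statistics.mean(costs)
    print(f"{name}: mean £{mean:,.0f}/month, P90 £{p90(costs):,.0f}, "
          f"contingency £{p90(costs) - mean:,.0f}")
```

The user-facing workload shows a much larger gap between its average and high-demand months, which is exactly the gap a contingency line in the business case needs to cover.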

Transition Costs

Transition costs address the organisational changes required as AI transforms processes:

Transition costs are frequently underestimated or entirely omitted from traditional business cases, yet they often determine success or failure in practice. The most effective business cases explicitly account for these expenses and spread them realistically across implementation phases.

For each cost category, I recommend including both direct financial expenses and resource requirements such as staff time, leadership attention, and organisational capacity. This comprehensive view helps boards understand the full investment required beyond simple budget allocations.
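
As a worked illustration of that structure, the sketch below lays out hypothetical costs across the implementation, operational, and transition dimensions over three years, alongside the non-financial resource asks. Every figure is a placeholder; the point is the shape of the model, not the numbers.

```python
# Minimal sketch with invented figures: an AI initiative's costs across the three
# dimensions discussed above, plus non-financial resource requirements.

costs = {
    "implementation": {"year_1": 420_000, "year_2": 140_000, "year_3": 60_000},   # iterative refinement cycles
    "operational":    {"year_1": 90_000,  "year_2": 160_000, "year_3": 210_000},  # grows with usage
    "transition":     {"year_1": 110_000, "year_2": 70_000,  "year_3": 30_000},   # change management tapers off
}

resource_requirements = {
    "data_science_fte": 2.5,           # staff time, full-time equivalents
    "business_sme_days_per_month": 6,  # subject-matter expert availability
    "exec_sponsor_hours_per_month": 4, # leadership attention
}

def totals_by_year(cost_table: dict) -> dict:
    """Sum all three cost dimensions for each year in the plan."""
    years = sorted({year for dimension in cost_table.values() for year in dimension})
    return {year: sum(dimension.get(year, 0) for dimension in cost_table.values()) for year in years}

for year, total in totals_by_year(costs).items():
    print(f"{year}: £{total:,}")
print("Resource asks:", resource_requirements)
```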

Mapping Value Within the Well-Advised Framework

Building on the Well-Advised balanced scorecard approach, effective AI business cases must trace how value creation manifests specifically within each pillar. Rather than introducing new dimensions, we can map AI capabilities directly to outcomes using Well-Advised, creating a coherent story of value creation that remains consistent with the strategic narrative.

For each Well-Advised pillar, effective business cases must trace how AI capabilities connect to specific outcomes and establish a comprehensive measurement approach spanning different time horizons. The following framework helps boards understand both the causal relationships and how success will be measured across leading, lagging, and predictive indicators.

Innovation & New Products/Services
  Value creation outcomes: Enhanced R&D capabilities leading to faster product development cycles; New AI-enabled offerings creating previously impossible value propositions; Platform capabilities enabling ecosystem-based innovation; Data-driven insights uncovering unmet customer needs
  Leading indicators: Prototype development velocity; AI experiment completion rates; User feedback quality scores; Ideation volume from AI systems
  Lagging indicators: New product revenue; Market share in new segments; Patent applications; Time-to-market reduction
  Predictive indicators: Innovation opportunity forecasts; Market trend predictions; Technology adoption projections; Competitive positioning models

Customer Value & Growth
  Value creation outcomes: Increased personalisation improving customer satisfaction and loyalty; Enhanced service capabilities driving retention and expanded relationships; Improved targeting precision increasing acquisition effectiveness; AI-enhanced experiences creating competitive differentiation
  Leading indicators: User engagement with AI features; Early adoption metrics; Customer feedback sentiment; Service response improvements
  Lagging indicators: Customer satisfaction scores; Net Promoter Score; Customer lifetime value; Retention rate improvements
  Predictive indicators: Churn probability models; Customer need predictions; Market segment growth forecasts; Personalisation impact simulations

Operational Excellence
  Value creation outcomes: Automated processes reducing manual effort and error rates; Predictive capabilities improving resource allocation and utilisation; Enhanced quality control systems reducing waste and rework; Process optimisation compressing cycle times and improving throughput
  Leading indicators: Process improvement early results; Error reduction in pilot areas; System reliability metrics; AI-human collaboration efficiency
  Lagging indicators: Cost per transaction; Labour productivity gains; Quality improvement metrics; Cycle time reductions
  Predictive indicators: Process bottleneck predictions; Resource optimisation models; Maintenance requirement forecasts; Capacity utilisation projections

Responsible Transformation
  Value creation outcomes: Improved compliance monitoring reducing regulatory exposure; Enhanced risk detection enabling proactive mitigation; Sustainability improvements through resource optimisation; Workforce augmentation enabling focus on higher-value activities
  Leading indicators: Governance framework adoption; Risk assessment coverage; Bias testing results; Sustainability initiative metrics
  Lagging indicators: Compliance incident reduction; Risk exposure metrics; Carbon footprint impact; Workforce skill transformation
  Predictive indicators: Regulatory change forecasts; Ethics violation probability; Sustainability impact projections; Workforce transition models

Revenue, Margin & Profit
  Value creation outcomes: Direct cost reduction through operational improvements; Revenue enhancement through improved sales effectiveness; Margin expansion through pricing optimisation; Profit growth through combined efficiency and growth effects
  Leading indicators: Early revenue indicators; Cost reduction in pilot areas; Pricing optimisation test results; Sales conversion improvements
  Lagging indicators: Revenue growth; Margin expansion; Cost structure improvements; Return on AI investment
  Predictive indicators: Market opportunity forecasts; Profit growth simulations; Economic scenario modelling; Revenue stream diversification projections

This integrated framework connects the “what” (value creation outcomes) with the “how” (measurement approach) for each Well-Advised pillar. It provides boards with a comprehensive view of value creation that spans from initial pilots to long-term transformation, tracking progress across all stages of AI adoption through:

Business cases should clearly articulate which metrics will be tracked at each stage of implementation, creating a cohesive narrative of value creation that spans from initial pilots to long-term transformation.
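
One lightweight way to make that articulation explicit is to record the measurement plan as data, keyed by adoption stage and indicator type. The sketch below does this with example metrics drawn from the framework above; the stage groupings and metric selections are illustrative, not a mandated set.

```python
# Illustrative sketch only: which metrics will be tracked at each stage of
# implementation, split into leading, lagging, and predictive indicators.
# Metric names are examples taken from the framework above.

measurement_plan = {
    "Experimenting and Early Adopting": {
        "leading":    ["Prototype development velocity", "Error reduction in pilot areas"],
        "lagging":    [],  # often too early for lagging measures
        "predictive": ["Process bottleneck predictions"],
    },
    "Adopting and Optimising": {
        "leading":    ["User engagement with AI features"],
        "lagging":    ["Cost per transaction", "Customer satisfaction scores"],
        "predictive": ["Churn probability models"],
    },
    "Transforming and Scaling": {
        "leading":    ["AI-human collaboration efficiency"],
        "lagging":    ["Revenue growth", "Return on AI investment"],
        "predictive": ["Economic scenario modelling"],
    },
}

# A quick completeness check a reviewer might run before the case is submitted.
for stage, indicators in measurement_plan.items():
    missing = [kind for kind, metrics in indicators.items() if not metrics]
    if missing:
        print(f"{stage}: no {', '.join(missing)} indicators defined yet")
```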

Creating Value Chain Visualisations

To help boards understand these value connections, I recommend creating visual mappings that show how specific AI capabilities flow through to business outcomes within each Well-Advised pillar. For example, an AI contract analysis system might show a value chain that demonstrates:

| AI Capability        | Operational Impact          | Strategic Outcome                     | Financial Value                     |
|----------------------|-----------------------------|---------------------------------------|-------------------------------------|
| Document processing  | 75% faster review time      | Increased contract throughput         | £450K annual labour cost reduction  |
| Pattern recognition  | 60% error reduction         | Improved compliance and auditability  | £300K compliance risk reduction     |
| Knowledge extraction | Enhanced contract analytics | Stronger negotiation position         | £400K improved contract terms       |
These visualisations help boards understand both the direct impacts and the network effects that emerge as AI capabilities mature and extend across the organisation. They maintain consistency with our Well-Advised framework while providing the granular cause-effect mapping boards need to evaluate investment cases.
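
For completeness, the same value chain can be expressed as data so the financial column totals itself. The sketch below uses the illustrative figures from the table above; the structure of each link is an assumption about how you might model it, not a required format.

```python
# Minimal sketch: the contract-analysis value chain from the table above as data,
# with the annual financial value totalled across the three capabilities.

value_chain = [
    {"capability": "Document processing", "operational_impact": "75% faster review time",
     "strategic_outcome": "Increased contract throughput", "annual_value_gbp": 450_000},
    {"capability": "Pattern recognition", "operational_impact": "60% error reduction",
     "strategic_outcome": "Improved compliance and auditability", "annual_value_gbp": 300_000},
    {"capability": "Knowledge extraction", "operational_impact": "Enhanced contract analytics",
     "strategic_outcome": "Stronger negotiation position", "annual_value_gbp": 400_000},
]

total = sum(link["annual_value_gbp"] for link in value_chain)
print(f"Total modelled annual value: £{total:,}")   # £1,150,000 across the three capabilities
```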

Aligning ROI Horizons with AI Stages of Adoption

Traditional ROI calculations typically focus on a single time horizon—often 1-3 years—which systematically undervalues AI’s longer-term and compound impacts. Rather than using arbitrary timeframes, I recommend aligning return horizons directly with the AI Stages of Adoption framework, creating a more cohesive approach to value measurement.

This alignment helps boards understand how value profiles evolve as initiatives mature through different adoption stages, providing a more intuitive way to evaluate investment cases. It also reinforces the reality that AI initiatives create different types of value at different stages of maturity.

Experimenting and Early Adopting Stage Returns (0-12 months)

The initial stages focus on validating approaches and building foundational capabilities:

While financial metrics provide some validation in this horizon, capability building and learning often represent the most significant value. Early KPIs should include user adoption rates, accuracy measures, time savings, and initial financial impacts, but equally important are the insights gained and capabilities established.

Adopting and Optimising Stage Returns (1-2 years)

As initiatives progress through adoption and into optimisation, value scales across the organisation:

This phase typically shows accelerating returns as fixed investments in data, infrastructure, and governance support broader application. ROI calculations should reflect this increasing return rate rather than assuming linear progression. The value created during this stage often justifies the initial investment made during experimentation.

Transforming and Scaling Stage Returns (2+ years)

In the most advanced stages, AI fundamentally reshapes core business capabilities:

While these longer-term impacts involve greater uncertainty, they often represent the most significant value potential. The most effective business cases acknowledge this uncertainty while providing scenario-based projections rather than single-point estimates.

For each adoption stage, I recommend including both financial metrics (NPV, ROI, payback period) and strategic impact measures aligned with the Well-Advised Framework. This balanced approach helps boards evaluate both immediate returns and long-term strategic value, making more informed investment decisions than traditional approaches allow.
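
A hedged, worked example of those financial metrics is sketched below: NPV, simple ROI, and payback period for a set of invented stage-aligned cash flows in which returns accelerate rather than grow linearly. The discount rate, stage boundaries, and cash flows are assumptions chosen only to show the calculation.

```python
# Minimal sketch with invented cash flows: NPV, simple ROI, and payback for an
# initiative whose returns accelerate through the adoption stages.

DISCOUNT_RATE = 0.10  # assumed cost of capital

stage_cash_flows = [
    ("Experimenting / Early Adopting (year 1)", -500_000),  # net investment
    ("Adopting / Optimising (year 2)",           250_000),
    ("Adopting / Optimising (year 3)",           450_000),
    ("Transforming / Scaling (year 4)",          800_000),  # accelerating returns
]

def npv(rate: float, flows: list) -> float:
    """Discount each year's net cash flow back to present value and sum."""
    return sum(cf / (1 + rate) ** year for year, (_, cf) in enumerate(flows, start=1))

def payback_year(flows: list):
    """First year in which cumulative undiscounted cash flow turns non-negative."""
    cumulative = 0
    for year, (_, cf) in enumerate(flows, start=1):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None  # not paid back within the modelled horizon

invested = -sum(cf for _, cf in stage_cash_flows if cf < 0)
returned = sum(cf for _, cf in stage_cash_flows if cf > 0)

print(f"NPV @ {DISCOUNT_RATE:.0%}: £{npv(DISCOUNT_RATE, stage_cash_flows):,.0f}")
print(f"Simple ROI: {(returned - invested) / invested:.0%}")
print(f"Payback in year: {payback_year(stage_cash_flows)}")
```

Presenting the cash flows by stage rather than as a flat annual series makes the accelerating return profile, and the early negative years, explicit to the board.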

This adoption-aligned approach is particularly important for organisations in earlier AI Stages of Adoption. Initiatives in the Experimenting and early Adopting stages often show modest initial returns while building capabilities that enable substantially greater value in later stages. Without this stage-based view, boards might prematurely terminate promising initiatives based on initial returns alone, missing their long-term transformative potential.

Designing Validation and Scaling Plans

Effective AI business cases don’t just project outcomes—they outline how value will be validated and then scaled across the organisation. This structured approach provides boards with confidence that investments will deliver claimed returns while anticipating expansion requirements.

The validation and scaling plan should address several key dimensions:

Pilot Design and Success Criteria

The validation approach should define:

Effective pilot designs balance sufficient scope to demonstrate value with appropriate constraints to manage risk. They focus on validating not just technical performance but business impact—confirming that AI capabilities translate to meaningful outcomes in real operational contexts.
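
One way to make success criteria unambiguous is to state them as explicit thresholds that a pilot either meets or does not, covering business impact as well as technical performance. The sketch below shows this pattern; the metrics, targets, and results are hypothetical placeholders for whatever your pilot actually measures.

```python
# Illustrative sketch only: pilot success criteria as testable thresholds.
# All metric names, targets, and results are hypothetical placeholders.

success_criteria = {
    "model_accuracy":            {"target": 0.90, "higher_is_better": True},
    "review_time_reduction_pct": {"target": 50,   "higher_is_better": True},
    "user_adoption_rate_pct":    {"target": 60,   "higher_is_better": True},
    "cost_per_transaction_gbp":  {"target": 2.50, "higher_is_better": False},
}

pilot_results = {
    "model_accuracy": 0.93,
    "review_time_reduction_pct": 55,
    "user_adoption_rate_pct": 48,
    "cost_per_transaction_gbp": 2.10,
}

for name, criterion in success_criteria.items():
    actual = pilot_results[name]
    met = (actual >= criterion["target"]) if criterion["higher_is_better"] else (actual <= criterion["target"])
    print(f"{name}: target {criterion['target']}, actual {actual} -> {'met' if met else 'NOT met'}")
```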

Scaling Roadmap and Requirements

The scaling plan should outline:

The most effective scaling plans align with your overall position in the AI Stages of Adoption journey. Organisations in earlier stages should focus on building foundational capabilities that enable future scaling, while those in more advanced stages can emphasise broader deployment and transformation potential.

Capability Development Strategy

As I outlined in the Five Pillars framework, sustainable AI value requires developing capabilities across multiple dimensions. The business case should address how the initiative will build these capabilities through:

This capability view helps boards understand how individual initiatives contribute to broader organisational readiness. It emphasises the platform value created alongside direct business returns, justifying investments that might otherwise appear marginal when viewed in isolation.

The most compelling business cases present validation and scaling not as sequential processes but as integrated approaches to learning and value creation. They position initial implementations as strategic assets that generate both immediate returns and critical insights for future expansion.

This approach helps overcome the traditional business case challenge of requiring comprehensive plans before implementation. By structuring initiatives as learning-oriented investments with clear validation criteria, organisations can move forward despite initial uncertainty while maintaining appropriate governance and accountability.

Integrating the Components into a Coherent Business Case

While we’ve explored each component separately, effective AI business cases integrate them into a coherent narrative that helps boards understand the complete value proposition. This integration should balance rigorous analysis with strategic vision while focusing on the dimensions that boards care about most.

I recommend structuring AI business cases around four core sections that align with our established frameworks:

  1. Strategic Intent and Well-Advised Alignment: How the initiative advances organisational priorities across the Well-Advised dimensions, with clear connections to strategic goals and stakeholder benefits. This section uses the balanced scorecard approach to demonstrate comprehensive strategic alignment.

  2. Investment Requirements and Cost Structure: Comprehensive financial projections across implementation, operational, and transition dimensions, with phased resource requirements and consideration of cross-initiative economies of scale.

  3. Value Creation and Stage-Based Returns: Multi-horizon value projections aligned with AI Stages of Adoption, mapping how specific AI capabilities drive outcomes within each Well-Advised pillar.

  4. Validation and Scaling Strategy: Pilot approach, success criteria, expansion pathways, and capability development plans that build organisational readiness across the Five Pillars framework.

Each section should include both narrative explanations and structured metrics, creating a balanced case that appeals to different stakeholder perspectives. The financial analysis should maintain traditional rigour while incorporating AI’s unique characteristics, particularly around iterative development, capability building, and non-linear value creation.

For many organisations, particularly those in earlier AI maturity stages, the AI Centre of Excellence plays a crucial role in business case development. The AI CoE can provide standardised templates, governance oversight, and expertise that ensure consistent high-quality business cases across initiatives. This centralised approach helps boards compare opportunities effectively while maintaining appropriate governance standards.

Conclusion: Turning Clarity into Confidence

AI business cases cannot be lifted from traditional templates. They must reflect AI’s iterative, cross-functional nature, its shifting cost structures, and its compound impact over time. By using the Well-Advised Framework, boards can assess AI opportunities with a lens that balances innovation, ethics, operational rigour, and financial value.

The most successful organisations don’t just justify AI investment — they create shared confidence in the path ahead. This clarity enables faster decision-making, reduces implementation risk, and sets the stage for long-term transformation.

Next Steps

To put this into action, I recommend starting with three concrete steps:

  1. Select one AI opportunity and map its strategic alignment using the Well-Advised Scorecard.
  2. Construct a value chain visualisation showing how that initiative delivers value across dimensions.
  3. Model the cost structure and ROI profile across the AI Stages of Adoption to show both immediate and long-term returns.

Boards don’t need perfect foresight to approve AI investments — they need disciplined business cases that reflect reality, risk, and readiness.

In the final article of this series, we’ll explore how to present AI business cases effectively to Boards: addressing common questions and concerns, securing stakeholder buy-in, and building the understanding needed for confident decision-making, so that well-constructed business cases receive the consideration and support they deserve from key stakeholders.

Let's Continue the Conversation

I'm interested in hearing about your organisation's experience with building AI business cases. What aspects have you found most challenging? Which approaches have proven most effective in gaining board approval for AI investments?




About the Author

Mario Thomas is a transformational business leader with nearly three decades of experience driving operational excellence and revenue growth across global enterprises. As Head of Global Training and Press Spokesperson at Amazon Web Services (AWS), he leads worldwide enablement delivery and operations for one of technology's largest sales forces during a pivotal era of AI innovation. A Chartered Director and Fellow of the Institute of Directors, and an alumnus of the London School of Economics, Mario partners with Boards and C-suite leaders to deliver measurable business outcomes through strategic transformation. His frameworks and methodologies have generated over two billion dollars in enterprise value through the effective adoption of AI, data, and cloud technologies.