
The Invisible Asset: Why Boards Should Govern Data Like It's on the Balance Sheet

Llantwit Major | Published in AI, Board and Data | 9 minute read
State-of-the-art industrial robotic machinery installed in a grand but structurally compromised room with crumbling ornate plaster walls, peeling paint, arched windows letting in natural light, and debris scattered across a deteriorating parquet floor—a visual metaphor for investing in advanced AI capabilities without addressing underlying data quality foundations (Image generated by ChatGPT 5.2)

Businesses treat expensive assets like office buildings, machinery, vehicles, and data centres as tangible assets on the balance sheet — affording them a degree of stewardship. Data assets — which may now contribute more to enterprise value than the buildings that house them — sit outside the balance sheet due to their intangible nature, and outside this governance discipline. IT protects data operationally through backups and security controls, but stewardship in the strategic sense is largely absent. Ownership fragments across functions. Quality degrades undetected. No one assesses whether data assets are fit for the purposes now being asked of them. A physical asset treated this way would be condemned. Data, in governance terms, already is.

The value being neglected is not marginal. Intangible assets now represent 90% of S&P 500 market value. Global intangible value reached an all-time high of USD 97.6 trillion in 2025, yet the majority remains unreported on company financial statements — invisible to the governance mechanisms that would otherwise demand stewardship. This invisibility carries a concrete price tag: poor data quality costs organisations an average of USD 12.9 million annually.

As AI becomes central to competitive strategy, the cost of this neglect compounds. 63% of organisations either lack or are uncertain about the right data management practices for AI — meaning data quality gaps cascade into model failures, compliance risks, and missed opportunities. The Institute of Directors reports that a quarter of directors are concerned about the lack of an internal AI policy, strategy, or data governance framework in their organisations.

Boards cannot wait for accounting standards to catch up. The competitive imperative exists now. The question isn’t whether data should appear on the balance sheet — it’s whether Boards will govern it as if it does.

The accounting standards gap

The invisibility of data assets isn’t accidental — it’s structural. IAS 38 Intangible Assets explicitly prohibits capitalising internally generated intangibles, including databases and proprietary data assets. Because research costs must be expensed immediately, and development costs can only be capitalised if strict criteria are met, organisations expense most data investments as they occur. The balance sheet never sees assets that may drive the majority of enterprise value.

The International Accounting Standards Board recognises the problem — a 2025 project is actively reviewing IAS 38, aiming to make the standard more suitable for newer types of intangible items and new ways of using them. But accounting standards reform moves slowly, and substantive changes remain years away from implementation. Boards cannot wait. They must govern assets that accounting rules, for now, pretend don’t exist.

The gap between what companies are worth and what appears on their balance sheets has never been wider. The S&P 500’s market value now stands at a multiple of its reported book value — for technology-intensive firms, the multiple is higher still. Three-quarters of what investors are paying for doesn’t appear in the accounts. That missing value has no formal owner, no condition assessment, no maintenance schedule.

This matters more now than ever. AI systems are only as good as the data they’re trained on. Poor data quality doesn’t merely waste investment — it actively degrades AI performance, creating model drift, biased outputs, and compliance exposure that compounds over time.

Poor data quality doesn’t stop at inefficiency — it cascades into ethical pitfalls when AI decisions affect customers, legal exposure when regulatory requirements go unmet, and financial shortfalls when AI investments fail to deliver projected returns. The Six Concerns I’ve written about previously reveal how data governance failures amplify across the entire governance landscape, touching strategic alignment, ethical responsibility, financial impact, risk management, stakeholder confidence, and capability building simultaneously.

Accounting standards may eventually catch up. Boards cannot afford to wait. The next question is practical: what would it mean to steward data as if it were on the balance sheet?

The stewardship disciplines

The disciplines aren’t mysterious — they’re the same ones Boards already apply to physical assets: regular condition assessments, clear ownership, preventive maintenance, impairment testing, strategic value review. The challenge isn’t inventing new governance mechanisms, but extending existing ones to data.

Now consider how most organisations actually treat data. Ownership fragments across functions, with no single point of accountability. Quality assessment is sporadic if it occurs at all — many organisations do not systematically measure data quality — and maintenance tends to be reactive, triggered by crisis rather than scheduled discipline. Impairment goes undetected until an AI initiative fails or a compliance gap surfaces. Investment decisions proceed without any clear picture of existing data asset condition.

What does this look like in practice? A named owner for customer data, not a shared assumption that “marketing handles it.” Quality metrics that track completeness, accuracy, and timeliness — reviewed with the same regularity as financial reporting. Active curation that retires stale datasets before they contaminate AI training. And a simple question asked of every data asset: is this still fit for the purposes now being asked of it?
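To make those quality metrics concrete, here is a minimal sketch of how completeness, timeliness, and accuracy might be scored for a customer dataset. The record fields, freshness window, and validation rule are illustrative assumptions, not a prescribed standard:

```python
from datetime import date

# Hypothetical customer records; field names are illustrative assumptions.
records = [
    {"email": "a@example.com", "country": "UK", "updated": date(2025, 11, 1)},
    {"email": None,            "country": "UK", "updated": date(2023, 2, 14)},
    {"email": "c@example",     "country": None, "updated": date(2025, 9, 30)},
]

def completeness(rows, fields):
    """Share of required fields that are actually populated."""
    filled = sum(1 for r in rows for f in fields if r.get(f) is not None)
    return filled / (len(rows) * len(fields))

def timeliness(rows, as_of, max_age_days=365):
    """Share of records refreshed within the freshness window."""
    recent = sum(1 for r in rows if (as_of - r["updated"]).days <= max_age_days)
    return recent / len(rows)

def accuracy(rows, rule):
    """Share of populated email values passing a validation rule (a crude proxy)."""
    vals = [r["email"] for r in rows if r["email"] is not None]
    return sum(1 for v in vals if rule(v)) / len(vals)

as_of = date(2025, 12, 1)
print(f"completeness: {completeness(records, ['email', 'country']):.2f}")
print(f"timeliness:   {timeliness(records, as_of):.2f}")
print(f"accuracy:     {accuracy(records, lambda v: '@' in v and '.' in v.split('@')[-1]):.2f}")
```

Reviewed alongside financial reporting, even scores this simple surface the trend that matters: whether each asset is improving or quietly degrading.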

Most organisations start from a low base. They have policies — acceptable use documents, data classification schemes — but the operational discipline to enforce them is absent. That gap between policy and practice widens as AI adoption scales.

Shadow AI usage compounds these challenges. Research shows 22% of files uploaded to generative AI tools contain sensitive content, with 4.37% of prompts including sensitive information. Particularly concerning, 26.3% of sensitive prompts flow through free consumer accounts rather than enterprise-governed tools. Without data governance discipline, shadow AI becomes shadow data exposure — risk multiplying beneath the surface while governance attention focuses elsewhere.

The governance architecture

Data stewardship needs a home — and for most organisations, the AI Centre of Excellence (AI CoE) provides the natural governance vehicle. It reports to the risk committee with Board-level visibility, ensuring data strategy receives appropriate senior attention rather than being delegated to IT functions where strategic perspective may be limited.

The AI CoE’s role in data governance encompasses defining data quality standards and measurement approaches, establishing ownership accountability across data domains, connecting data strategy to business objectives, overseeing the balance between data accessibility for innovation and protection for compliance, and providing Board-level reporting on data asset health. This isn’t about creating bureaucracy — it’s about ensuring existing governance mechanisms include data within their scope.

Data governance integrates naturally with existing mechanisms rather than requiring new structures. It connects to every strategic priority: innovation depends on quality data to fuel experimentation, customer value depends on accurate data to drive personalisation, operational excellence depends on reliable data to enable automation, and financial performance depends on trustworthy data to support AI-driven analysis. The question for Boards isn’t where data governance sits — it’s whether anyone is accountable for it at all.

Different functions operate at different AI maturity levels, and their data readiness varies accordingly. Marketing may have sophisticated customer data capabilities while operations struggles with fragmented systems. Finance may have rigorous data governance for regulatory reporting while product development relies on informal data sharing. The AI CoE must navigate this multi-speed reality, providing proportionate governance that enables advanced functions while building foundational capabilities elsewhere.

The AI-specific implications of data governance have intensified. Data as training fuel makes quality more critical than ever. Model performance degrades with poor data quality — not just in accuracy but in fairness, reliability, and compliance. Proprietary datasets create competitive moats that strengthen over time; generic data creates no advantage. The 3× revenue-per-employee growth in AI-exposed industries accrues to organisations that combine AI capabilities with quality proprietary data. This intersection reinforces why the AI CoE is the natural custodian.

The minimum lovable governance principle applies here: the goal isn’t comprehensive data governance bureaucracy, but just enough governance to demonstrate good faith whilst preserving agility. Start with the data assets that matter most strategically, establish clear ownership, implement basic quality monitoring, and build from there based on what works. Governance should be proportionate to risk — customer-facing data supporting AI decisions warrants more attention than internal operational data used for reporting.

Putting it into practice

Begin with visibility. Identify the proprietary datasets unique to your organisation — the data assets competitors cannot easily replicate — and assess their quality using practical metrics: completeness, accuracy, timeliness, consistency. Map where ownership is unclear and where accountability gaps exist. A simple prioritisation matrix helps structure what comes next: plot datasets on axes of strategic value and current quality. High strategic value combined with high quality means leverage and protect. High strategic value combined with low quality means invest in improvement. Low strategic value means proportionate governance only. The goal isn’t comprehensive cataloguing — it’s identifying the data assets that warrant balance-sheet-level stewardship, then asking: which of these, if improved, would unlock the most strategic value?
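The prioritisation matrix above can be sketched as a simple quadrant classifier. The dataset names, scores, and threshold are hypothetical placeholders for whatever scoring an organisation actually uses:

```python
def quadrant(strategic_value, quality, threshold=0.5):
    """Classify a dataset on the strategic-value / quality matrix described above."""
    if strategic_value >= threshold:
        return "leverage and protect" if quality >= threshold else "invest in improvement"
    return "proportionate governance only"

# Illustrative scores on a 0-1 scale; both names and numbers are assumptions.
portfolio = {
    "customer transactions": (0.9, 0.8),
    "support tickets":       (0.8, 0.3),
    "office badge logs":     (0.2, 0.9),
}

for name, (value, quality) in portfolio.items():
    print(f"{name}: {quadrant(value, quality)}")
```

The point of the exercise is less the classification itself than the conversation it forces: agreeing which datasets are strategically valuable, and how their quality will be scored.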

Then benchmark: given that 63% of organisations lack, or are uncertain about, the data management practices AI requires, assess where your own measurement, ownership, and Board-level visibility stand against that baseline. Use it to identify where the current state falls short of the governance discipline warranted by data’s strategic importance.

A portfolio approach balances quick wins with transformative initiatives. Data cleaning in high-priority domains delivers immediate AI readiness improvements. Quality monitoring dashboards create visibility where none existed. Ownership clarification removes friction from AI initiatives that stall when no one can authorise data access. These foundational improvements enable more ambitious AI integration over time.

The Board imperative

This stewardship mindset shifts Board accountability fundamentally. Boards wouldn’t approve a facilities budget without understanding the condition of their buildings. They wouldn’t make capital allocation decisions without asset condition reports. Yet AI budgets routinely receive approval without equivalent data asset health assessments — investments in capability undermined by foundations no one has examined.

When AI makes a decision, the Board is making that decision. When that decision depends on data quality, data governance becomes Board governance. The invisibility of data on the balance sheet doesn’t diminish the Board’s accountability for its stewardship; it merely obscures the risk until failures surface.

The accounting standards will eventually catch up — but the organisations waiting for that moment will find themselves building AI capabilities on foundations they never thought to examine. The question for Boards isn’t whether data deserves balance-sheet discipline. It’s whether you’ll apply it before the cost of neglect becomes impossible to ignore.

Organisations that build data governance discipline now create foundations for AI advantage. Those that don’t will find themselves investing in AI capabilities undermined by data quality failures — the equivalent of installing advanced machinery in a condemned building.

When accounting standards lag, governance must lead.

Let's Continue the Conversation

Thank you for reading about governing data with balance-sheet discipline. I'd welcome hearing about your Board's experience with data stewardship: whether you're grappling with the visibility gap between data's strategic importance and its governance attention, exploring how to embed data accountability within your AI Centre of Excellence, or finding practical ways to apply stewardship disciplines to assets that accounting standards render invisible.