Comprehensive Process Intelligence — December 2025
Executive Summary
Analysis of 12 parliamentary documents (9 oral evidence sessions, 3 written submissions) reveals systematic governance implementation gaps between UK AI policy rhetoric and delivery reality. Seven distinct patterns of announcement-implementation disconnect are now documented across: compute infrastructure, policy publication, institutional coordination, skills development, data infrastructure, legacy systems, and regulatory frameworks.
PART 1: THE SEVEN IMPLEMENTATION GAPS
1. COMPUTE INFRASTRUCTURE GAP
- Julia Lopez Written (Dec 2024): “DSIT developing ‘long-term plan’ aligned with AI Action Plan”
- Lord Vallance Oral (Feb 2025): Confirmed no funding allocated, multiyear spending review “next year”
- Current Status: AIRR launching early 2025 (Bristol, Cambridge), Isambard-AI operational (5,500+ Nvidia GPUs)
- Reality Check: Government substituted unfunded exascale promise with smaller AIRR delivery
2. POLICY PUBLICATION GAP
- AI/Creative Tech Supplementary (Dec 2024): “Hopes to publish shortly”
- Science Minister Oral (Feb 2025): “Hopes to publish shortly” (8 months post-commission)
- Digital & AI Road Map: Originally promised summer 2025, delayed — “coming shortly”
- Digital Centre Oral (Nov 2025): “Delayed due to new Secretary of State priorities”
3. INSTITUTIONAL PROLIFERATION GAP
- AI Safety Institute (AISI) — £100M budget, statutory footing via AI Bill
- Central AI Risk Function (CAIRF) — 23 FTE, risk assessment/mitigation
- Regulatory Innovation Office (RIO) — Chair to be appointed, 4 priority areas
- Foundation Model Taskforce — £100M seed funding
- AI Opportunities Unit — To be established
- Incubator for AI (i.AI) — Now in GDS, productivity focus
- Digital Regulation Cooperation Forum (DRCF) — ICO, CMA, Ofcom, FCA
- AI/Digital Regulations Service — Health sector (CQC, HRA, MHRA, NICE)
- Science & Technology Committee — PM-chaired Cabinet sub-committee
- Digital Commercial Centre of Excellence (procurement)
- UK Strategic Public Investment Forum (Innovate UK, BBB, NWF)
- AI Accelerator Programme (civil service upskilling)
- Responsible Tech Adoption Unit (ethics/transparency)
4. SKILLS DEVELOPMENT GAP
- PAC Oral Evidence: AI skills among civil servants at 5.4% vs 6% target
- AI Accelerator Programme: First cohort of 25 graduated, second in progress
- Science Minister Oral: “TechTrack upskilling apprenticeship programme” expansion planned
- Digital Centre Written: 32,000 members in the Government Digital and Data Profession, “major recruitment campaigns targeting most important skills gaps”
- Civil service AI skills growth: 0.6 percentage points below the modest 6% target
- Global Talent Visa applications: 7,000-8,000 (10% increase) — incremental, not transformational
- Contractor dependency remains: Average contractor costs 3x public servant salary
- Skills remuneration “uncompetitive with private sector” (State of Digital Government Review)
- Treasury constraints on competitive salaries
- Home Office visa restrictions creating bureaucratic barriers
- Cultural resistance to non-traditional recruitment in Whitehall
- Time lag between programme launch and skilled workforce availability
5. DATA INFRASTRUCTURE GAP
- AI/Creative Tech Supplementary (Dec 2024): “Work underway to design” — “decisions on design and implementation in due course”
- Digital Centre Written (March 2025): “Still in early stages of development with key decisions to be made”
- Purpose Stated: Unlock public data for researchers, policymakers, business, AI developers
- Current Status: Extensive engagement with experts, “learning lessons from previous data initiatives”
6. LEGACY SYSTEMS GAP
- Digital Centre Oral (Nov 2025): 75% of government systems now identified as legacy
- Definition: Unsupported software, expired vendor contracts, no knowledge/skills to operate, obsolete hardware
- Previous Approach: Only red-rated systems reported to GDS
- Trend: Count of highest risk/critical systems rose 26% from 2023 to 2024
- New Approach: Automating data feeds for “better moving picture”
- £20B digital/data/tech spending review includes legacy remediation funding
- DEFRA piloting AI to speed recoding/remediation
- Vulnerability scanning: 600+ organisations, fixing 100+ critical vulnerabilities monthly
- Departments prioritise new programmes over maintenance
- Technical debt compounds faster than remediation
- Siloed procurement creates new legacy through fragmented solutions
- No cross-government enforcement mechanism for standards compliance
7. REGULATORY FRAMEWORK GAP
- RIO Establishment: Chair to be appointed, 4 priority areas
- Priority Areas: AI in healthcare (MHRA), drones/autonomy (CAA), cultured meat (FSA), general new tech
- Approach: Practical unblocking of specific problems, not systemic reform
- Science Minister Oral (Feb 2025): “Unashamedly practical” — picking specific problems in particular areas
- Out-of-line-of-sight drone flying regulatory sandbox
- Cultivated meats regulatory framework
- 84 barriers identified for individual companies (e.g., Scarlet synthesised blood transport regulations)
- Political need for visible progress over systemic change
- Recognition that comprehensive regulatory reform is too complex
- Hope that small wins create momentum for broader change
- Risk: Fragmented fixes don’t address structural regulatory incoherence
PART 2: FUNDING & INVESTMENT INTELLIGENCE
UK AI Investment Landscape
- VC Ecosystem: Healthiest in Europe, £72B raised 2021-23, third largest globally (after US/China)
- Tech Sector: Largest in Europe, top 5 globally
- Unicorns: 161 since 2000 (more than France, Germany, Sweden combined)
- 2023 Funding: £21.3B to UK tech startups (vs France £9.2B, Germany £8.2B)
- AI Sector: £14.2B revenue 2023, projected £800B+ by 2035
- Pension Fund Problem: UK lags G7 in pension fund investment in domestic tech/science companies
- Scale-Up Finance Gap: Acute shortage of growth capital for companies moving from start-up to scale-up
- US Capital Dominance: Increasing US money in UK VC creates pressure for “Delaware flip”
- Public Markets: LSE not attractive for tech listings compared to NASDAQ
- Mansion House Compact (2024)
– 17 largest pension providers (increased from initial 11)
– 10 of first 11 already developing internal skills
– 8 have clear investment plans
– Target: 10% of funds in growth assets by 2030
– Reality Check: £4B to British Business Bank to cornerstone investments, but actual flow still “slow trickle”
- British Growth Partnership
– BBB-led vehicle to mobilise institutional capital
– First investments expected September 2025
– Mandate to lead and reduce risk for pension funds
– Challenge: Amounts small compared to Ontario Teachers’ Fund benchmarks
- R&D Tax Credits
– £7B annually (effectively a third of science spend, not in £20.4B headline)
– 20% rate (joint highest uncapped in G7)
– Companies receive £15-£27 per £100 spent on qualifying R&D
– Enhanced Support for R&D Intensive SMEs (ERIS): £1.3B p.a. to ~20,000 SMEs
“We have all lived through 30 years of this not happening. It is now clear that there is a plan in place to make this happen. There is some action. It is not as fast as those of us who are impatient want to see happen, but it is happening.”
- Ministers acknowledge 30-year failure but defend incremental progress
- Expectation of “slow trickle followed by sudden acceleration” in pension fund investment
- No timeline for acceleration, no willingness to mandate investment
- Strategy: Make UK attractive, not compel investment
- Risk: Market forces may not deliver at required pace for growth mission
Procurement as Innovation Driver
- MoD: 10% of defence equipment expenditure on novel technologies
- UK Defence Innovation: £400M set aside
- Ministerial View: “Biggest example to date of where procurement can really make a difference”
- Contracts for Innovation (Innovate UK)
– Procuring R&D to answer technical issues in departments
– Pre-commercialisation phase
– Limitation: Niche part of procurement pool, not scaled across government
- R&D Missions Accelerator Programme
– £500M to tackle missions board problems
– Model: Define problem → R&D solution → Procurement commitment
– Example problems: Detect high knife crime areas (non-invasive), AI in electricity grid
- Commercial Innovation Hub (Cabinet Office)
– Remit: Get innovation from SMEs embedded in procurement
– Early stage, effectiveness unclear
- Health Innovation Zones
– NHS procurement pull-through for digital/tech
– Innovator passports for faster adoption
– Regional pilots before national rollout
- Risk Aversion Culture:
– NAO/PAC historically punish failure
– Minister Lord Vallance: “If you are making a new medicine, when you start you have something like a 2% success rate. You just need to accept that”
– NAO now claims acceptance of risk profiles presented upfront, but not yet tested in practice
- Fragmented Buying Power:
– £26B annual digital/tech procurement
– Despite central frameworks, organisations contract locally
– Value for money diluted by duplication
- Departmental Variation:
– MoD and Health leading with innovation procurement
– Most departments still risk-averse, traditional procurement
– No cross-government mandate for innovation procurement targets
The recurring pattern:
- Announce bold targets (MoD 10%)
- Create coordination bodies (Commercial Innovation Hub)
- Hope success spreads organically
- No enforcement mechanism for laggard departments
- Result: Pockets of excellence, systemic underperformance
PART 3: THE OXFORD-CAMBRIDGE GROWTH CORRIDOR CASE STUDY
- All-of-Government Team:
– People seconded from relevant departments
– Do NOT come with departmental hat on
– Come as team members with shared mission
- Permanent Secretary Access:
– Each team member can access their home department Perm Sec directly
– Can free up departmental resources/decisions
- Ministerial Group:
– Ministers from each relevant department
– Chaired by Lord Vallance
– Collective responsibility for corridor success
- Direct Prime Minister Line:
– Lord Vallance can go straight to PM
– Alongside Chancellor relationship
- Accountability Structure:
– Cabinet Secretary pulled Perm Secs together
– Made corridor success part of their personal objectives
- Spending Review Integration:
– Team submitted coherent bid across departments
– Ensured departmental bids added up to corridor strategy
- Digital Twin Analytics:
– Data analytics across all aspects (transport, utilities, lab space, skills)
– Can model impact of delays or accelerations
– Integrated planning tool
“Often the problem is that everyone agrees in principle but of course, while it is someone’s 15th priority, it is someone else’s second and someone else’s fifth, yet you need a coherent priority for the whole thing otherwise it does not work.”
- Authority Not Coordination: Permanent Secretary objectives + PM access = real power
- Integrated Not Siloed: Single team, single budget submission, single planning tool
- Delivery Not Advisory: Explicit focus on making things happen, not recommending action
- Data-Driven: Digital twin allows evidence-based priority tradeoffs
PART 4: AI REGULATORY APPROACH — THE “USE NOT DEVELOPMENT” MODEL
UK’s Distinctive Regulatory Position
“AI Safety Institute to statutory footing via AI Bill. Bill limited to frontier AI work. Most AI regulated at point of use (application). Consultation to be launched.”
- EU: Comprehensive AI Act regulating development and deployment
- UK: Light-touch frontier models, heavy regulation at application point
- Status: Working toward harmonisation with EU but maintaining distinct approach
- AI Safety Institute (AISI)
– Focus: Frontier AI safety research
– Statutory basis via AI Bill
– £100M budget over 2 FY
– Scope: Most powerful AI models only
- Application-Point Regulation
– Individual sector regulators handle AI in their domains
– Health: CQC, HRA, MHRA, NICE
– Finance: FCA
– Communications: Ofcom
– Competition: CMA
– Data: ICO
- Cross-Sector Coordination
– Digital Regulation Cooperation Forum (DRCF)
– AI and Digital Hub trial ending 2025
– Limited enforcement power, relies on voluntary cooperation
Minister Lord Vallance (Feb 2025) cited Chinese AI company DeepSeek as evidence:
“Shows competition growing, not 2-3 companies monopoly”
- UK monitoring global AI competition actively
- Sees Chinese AI advancement as validating non-restrictive approach
- Believes market competition will constrain any single player dominance
- Not concerned about concentration risk in same way as EU
Standards Before Regulation
“Standards first, then think about what to regulate is exactly what the technologies and sectors are looking for when it comes to the UK’s approach to pro-innovation regulation.”
- UK first country to publish intended approach to quantum regulation (2024)
- Principle: Go as far as possible through norms and international standards
- Regulate only at point where necessary
- Method: Allow technology to flourish while preventing harm
- Industry lobbying success — delay binding regulation
- Genuine uncertainty about optimal regulatory frameworks for emerging tech
- Fear of over-regulation driving innovation overseas
- Hope that standards + light regulation = competitive advantage
PART 5: DIGITAL TRANSFORMATION — THE £20B QUESTION
State of Digital Government Review Findings
- Leadership Failure
– Little reward for prioritising service digitisation, reliability, risk mitigation
– Leaders not paid, promoted, or valued for digital agenda
– Digital leaders not consistently at senior levels
– Lack power to shape strategic agenda
- Structural Fragmentation
– Public sector organisations are independent bodies
– Limited mechanisms to contract services from each other
– Most build and maintain own technology estate
– Inhibits standardisation, interoperability, reuse, scale benefits
- Measurement Absence
– No consistent digital performance measurement
– Service quality, cost, risk, change delivery not tracked cross-sector
– Cannot recognise high performance
– Cannot identify organisations needing help
– Cannot make cross-sector strategic decisions
- Talent Crisis
– Compensation and career progression uncompetitive with private sector
– Especially acute for senior leaders
– Hard to attract and retain top digital/data talent
– No integrated cross-public sector workforce strategy
– Cannot respond strategically to resource/skills gaps
- Funding Dysfunction
– Spend biased toward new programmes
– Insufficient prioritisation of existing systems operation/maintenance
– Legacy assets underfunded
– Digital services need committed, sustained funding
– Current model: programme-based, suited to physical infrastructure not digital services
Government Response — Six-Point Plan
- Join Up Public Sector Services
– People interact with 40+ different services across 9 organisations (long-term health condition example)
– GOV.UK App and Wallet for personalised experiences
– “Get Britain Working” kickstarter addressing cross-department services
- Harness AI for Public Good
– i.AI building/testing AI tools for productivity
– £45B annual potential improvement identified
– GOV.UK Chat (LLM-powered) for business users
– AI Accelerator upskilling civil servants
- Strengthen Digital/Data Infrastructure
– GOV.UK One Login expansion
– National Data Library (still in design)
– Digital Backbone for API integration
– Vulnerability scanning across 600+ organisations
– Single unique identifier for children
- Elevate Leadership, Invest in Talent
– All public sector organisations must have digital leader on executive committee by 2026
– Digital non-executive director on boards by 2026
– Government Chief Digital Officer role raised to Second Permanent Secretary level
– Digital Hub in Manchester
- Fund for Outcomes, Procure for Growth
– £26B annual digital/tech procurement
– Digital Commercial Centre of Excellence
– Streamline governance, enable agility
– Whole-of-public-sector agreements for platform services (cloud)
- Commit to Transparency, Drive Accountability
– Departments publish metrics annually (service performance, value for money, resilience, AI adoption)
– Digital Inter-Ministerial Group reviews
– Secretaries of State held accountable
- £20B for digital/data/tech (double previous SR)
- £1.2B for GDS over spending review period
- £2B for AI opportunities (across departments for procurement pull-through)
- Government diagnosed all five root causes accurately
- Six-point plan addresses symptoms not causes
- No structural reform of fragmented public sector
- No solution to talent compensation gap
- No enforcement mechanism for mandated changes
- Relies on transparency and accountability without consequences for failure
PART 6: INTERNATIONAL POSITIONING & LEARNING
UK Global Rankings
- OECD Digital Government Index (2024): 3rd of 38 countries
- UN E-Government Index (2024): 7th of 193 member states
- Global Innovation Index (2024): 5th of 133 economies (2nd among G7)
- Market sophistication: 3rd globally
- Knowledge and technology outputs: 5th globally
- Human capital and research: 7th globally
- 4 of world’s top 10 universities
- Net satisfaction with digital government services: 79% (decade ago) → 68% (now)
- User expectations rising faster than service improvements
International Learning Examples
- All legislation “digital ready” (reduces service delivery complexity)
- Investments in resilient infrastructure
- AI, cloud computing, broadband deployment
- UK adopting: Digital-ready legislation principle
- Digital Government Masterplan
- “Ask citizen information once only” principle
- Strong data utilisation
- AI deployment in public services
- UK has MoU: Sharing expertise on data utilisation, AI deployment
- Trilateral partnership (Estonia-Ukraine-UK)
- Innovative AI uses
- Delivering services via apps (pre-GOV.UK App)
- Robust data sharing architecture
- Risk noted: Estonia and Germany both suffered data breaches (UK learning lessons)
- Bilateral engagement under MoU
- Digital Government Exchange (DGX) participation
- Specialised working groups: cloud, AI, cyber
- Resource exchanges for capability building
- Pension funds investing in UK science/tech
- Lord Vallance observation: “Canadian pensioners have done very well out of science and technology in the UK. Good for them, but we need to do the same here.”
- OECD: Data use in public sector
- UN: Supporting local level capabilities
- World Bank: Specialised working groups
- Imports good ideas (digital-ready legislation from Denmark)
- Creates MoUs and exchange programmes
- Produces reports and frameworks
- Gap: Translating international learning into domestic delivery
PART 7: ISAR GLOBAL STRATEGIC INTELLIGENCE
Three Governance Realities Confirmed
Seven documented patterns show this is not episodic failure but a structural feature:
- Compute infrastructure (exascale unfunded)
- Policy publication (6-8 month delays)
- Institutional coordination (9 bodies, no convening authority)
- Skills development (below target despite programmes)
- Data infrastructure (perpetual design phase)
- Legacy systems (accelerating despite remediation)
- Regulatory frameworks (tactical not strategic)
- Investors assessing policy certainty
- Companies planning market entry
- International organisations evaluating UK partnership reliability
At least 9 AI governance bodies created since 2023:
- Each with separate budget, governance, reporting lines
- No single entity with convening authority
- DSIT attempts coordination but lacks enforcement power
- Result: Coordination by committee, not by authority
- Ministers announce new bodies in response to problems
- Existing bodies not reformed or consolidated
- Each new body creates need for more coordination
- More coordination creates more coordination bodies
- Net Effect: Institutional complexity increasing, coordination effectiveness declining
- £20.4B R&D budget 2025/26
- £20B digital/data/tech (SR to 2027)
- £100M AISI (FY23/24-24/25)
- £100M Foundation Model Taskforce
- £500M R&D Missions Programme
- £7B R&D tax credits
- Exascale unfunded
- AI Action Plan unpublished 6+ months
- National Data Library in “design stage”
- Legacy systems worsening despite remediation funding
- Skills targets unmet
- Pension fund investment still “slow trickle”
- New initiatives (politically visible)
- Not operational costs (politically invisible)
- Not legacy remediation (no political credit)
- Not coordination mechanisms (no tangible output)
- Announced funding: What ministers claim
- Allocated funding: What appears in budgets
- Spent funding: What organisations actually receive
- Effective funding: What achieves stated outcomes
This four-level analysis reveals governance reality versus governance rhetoric.
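The four-level distinction above can be sketched as a small data model. This is a minimal illustrative implementation, not ISAR Global's actual tooling; the programme name and figures are hypothetical placeholders.

```python
# Hypothetical sketch of the four-level funding analysis described above.
# All figures and the programme name are illustrative, not sourced data.
from dataclasses import dataclass

LEVELS = ["announced", "allocated", "spent", "effective"]

@dataclass
class FundingLine:
    name: str
    announced: float   # what ministers claim (£m)
    allocated: float   # what appears in budgets (£m)
    spent: float       # what organisations actually receive (£m)
    effective: float   # what achieves stated outcomes (£m)

    def gaps(self) -> dict:
        """Attrition between each successive level of the funding pipeline."""
        values = [getattr(self, lvl) for lvl in LEVELS]
        return {
            f"{a}->{b}": earlier - later
            for (a, earlier), (b, later) in zip(
                zip(LEVELS, values), zip(LEVELS[1:], values[1:])
            )
        }

line = FundingLine("Illustrative programme",
                   announced=500, allocated=500, spent=120, effective=40)
print(line.gaps())  # the largest drop shows where delivery stalls
```

The largest gap between adjacent levels indicates where rhetoric parts company with delivery for a given commitment.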
Parliamentary Intelligence Opportunities
Current ISAR Global capability: Tracking 7,250+ AI-related parliamentary questions
- Link parliamentary questions to ministerial responses
- Track response times (FOI-equivalent accountability)
- Identify evasive language patterns (“in due course”, “hopes to publish shortly”)
- Map which questions get substantive versus procedural answers
- Correlate question patterns with policy announcements
- Early warning system for policy problems (question volume surge)
- Minister accountability tracking (commitment versus delivery)
- Opposition strategy intelligence (what they’re probing)
- Media attention predictors (questions that generate press coverage)
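The evasive-language tracking described above could be prototyped with simple phrase matching. A minimal sketch, assuming a hand-picked phrase list (the phrases "in due course" and "hopes to publish shortly" come from the document; the rest, and the scoring, are illustrative assumptions, not a validated model):

```python
# Sketch: flag ministerial answers that lean on evasive stock phrases.
# Phrase list and threshold are illustrative assumptions.
import re

EVASIVE_PHRASES = [
    r"in due course",
    r"hopes? to publish shortly",
    r"under review",
    r"as soon as possible",
]

def evasiveness_score(answer: str) -> int:
    """Count evasive-phrase hits in one answer (case-insensitive)."""
    text = answer.lower()
    return sum(len(re.findall(p, text)) for p in EVASIVE_PHRASES)

def flag_answers(answers: dict[str, str], threshold: int = 1) -> list[str]:
    """Return question IDs whose answers meet the evasiveness threshold."""
    return [qid for qid, a in answers.items() if evasiveness_score(a) >= threshold]
```

Run over a corpus of answers, a rising share of flagged responses on one topic would serve as the "early warning" signal mentioned above.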
- Written evidence: Mentions exascale, no problems flagged
- Oral evidence: Ministerial admission “announced without funding”
- Oral testimony more candid than written submissions
- Cross-examination reveals implementation challenges
- Follow-up questions expose contradictions
- Witness selection indicates government priorities
- Systematic analysis of witness selection patterns
- Track which organisations get invited repeatedly
- Compare written vs. oral evidence contradictions
- Map follow-up question evolution
- AISI → DSIT → Ministers → Cabinet
- CAIRF → Lead Government Departments → Risk Owners
- RIO → Regulators → Sectors
- i.AI → GDS → DSIT → Departments
- DRCF → Individual Regulators → Policy Coordination
- Show where coordination fails structurally
- Identify power vacuums in governance architecture
- Predict institutional conflicts
- Assess policy implementation feasibility
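The reporting chains listed above can be treated as a directed graph to test the "no convening authority" claim structurally. A sketch under the assumption that each arrow means "reports to"; the helper name is hypothetical:

```python
# Illustrative sketch: model the five reporting chains as a graph and check
# whether a single convening authority exists (i.e. exactly one terminal node).

CHAINS = [
    ["AISI", "DSIT", "Ministers", "Cabinet"],
    ["CAIRF", "Lead Government Departments", "Risk Owners"],
    ["RIO", "Regulators", "Sectors"],
    ["i.AI", "GDS", "DSIT", "Departments"],
    ["DRCF", "Individual Regulators", "Policy Coordination"],
]

def terminal_nodes(chains):
    """Nodes that never report onward: candidate seats of authority."""
    sources = {node for chain in chains for node in chain[:-1]}
    ends = {chain[-1] for chain in chains}
    return ends - sources

roots = terminal_nodes(CHAINS)
# More than one terminal node means the chains never converge on
# a single convening authority.
print(sorted(roots))
```

With the five chains as given, the graph ends in five distinct terminals, which is the structural power vacuum the analysis identifies.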
- Announcement: Ministerial speeches, press releases
- Allocation: Spending Review documents, departmental budgets
- Distribution: UKRI Gateway to Research, departmental breakdowns
- Delivery: FOI requests, NAO reports, evaluation studies
R&D Missions Accelerator example:
- Announced: £500M
- Allocated: Confirmed in SR
- Distribution: “First four problem statements and programme leads in place”
- Delivery: Not yet assessed (too early)
- Four-stage gap analysis shows where funding stalls
- Correlate announcement-delivery gaps with political priorities
- Predict which commitments will fail to materialise
Julia Lopez Meeting — Strategic Positioning
- Infrastructure sovereignty challenges
- SME competitiveness barriers in AI sector
- Opposition need for evidence-based policy alternatives
- Conservative Party rebuilding policy capacity post-election
- Process Intelligence She Doesn’t Have:
– Seven implementation gaps documented with parliamentary evidence
– Institutional proliferation mapping (9 bodies, coordination failures)
– Funding rhetoric vs. delivery reality tracking
– International comparison on implementation not just policy
- Opposition-Relevant Intelligence:
– Government’s own evidence of delivery failures
– Ministerial admissions in oral testimony
– Cross-party consensus on problems (Select Committee reports)
– Alternative approaches that could differentiate Conservative policy
- Ongoing Intelligence Capability:
– Weekly parliamentary question monitoring
– Select Committee evidence analysis
– Real-time policy publication tracking
– International governance developments
- Demonstrate Distinctive Value:
– Show seven implementation gaps with evidence trail
– Explain why no one else does governance process intelligence
– Illustrate international comparison methodology
- Establish Ongoing Relationship:
– Offer monthly parliamentary intelligence briefing
– Provide early warning on policy developments
– Support Opposition policy development with evidence base
- Position for Long-Term:
– ISAR Global as “governance reality intelligence”
– Not policy advocacy, not lobbying
– Independent evidence-based analysis
– Valuable to whichever party is in government
“You need intelligence on what governments actually deliver versus what they promise. That’s what ISAR Global provides. Not what policies should be, but what policy implementation patterns reveal about governance effectiveness.”
CONCLUSION: THE GOVERNANCE INTELLIGENCE OPPORTUNITY
UK parliamentary processes provide extraordinary transparency into government decision-making. Unlike most democracies, the UK publishes:
- All written ministerial responses
- All parliamentary questions and answers
- Full oral evidence transcripts from Select Committees
- Departmental submissions to inquiries
- Government responses to committee reports
This creates a comprehensive audit trail of governance commitments and delivery.
Most analysis focuses on:
- What policies government announces (political commentary)
- Whether policies are good ideas (policy advocacy)
- How policies compare to other countries (benchmarking)
Governance process intelligence focuses instead on:
- What governments commit to versus what they deliver
- How institutional architectures enable or prevent coordination
- Where funding rhetoric exceeds implementation capacity
- Which governance patterns predict failure or success
The seven implementation gaps demonstrate:
- Pattern recognition across policy areas
- Evidence-based documentation using parliamentary sources
- International comparison on implementation not just policy design
- Predictive capability for governance failure
This is governance process intelligence — and there is no other organisation systematically providing it.
- Julia Lopez Meeting (December 2025):
– Present seven implementation gaps analysis
– Demonstrate parliamentary intelligence methodology
– Establish ongoing briefing relationship
- Publication Strategy:
– Consider publishing summary of seven gaps (validates ISAR Global methodology)
– Share detailed analysis with strategic partners
– Use as demonstration of governance intelligence capability
- Parliamentary Intelligence Expansion:
– Systematic oral evidence analysis (remaining 54 documents in archive)
– Cross-committee pattern recognition (compare DSIT, JCHR, PAC findings)
– International comparison integration (how other parliaments track government delivery)