Phase 3 – Capture & Validate Data
1.0 Introduction
Having identified the right metrics, the next step is ensuring that reliable, timely, and trustworthy data exists to support them. This phase focuses on how performance data is gathered, verified, and prepared for use in analysis and reporting.
Inconsistent, inaccessible, or manipulated data undermines trust and derails improvement efforts. Phase 3 is where SPARA protects the integrity of insight by emphasising data quality, transparency, and traceability.
This phase also addresses a common challenge in fragmented organisations: gaining access to the right data sources and understanding their limitations.
2.0 Purpose and Objectives
Purpose
- Capture performance data from the most appropriate sources with integrity and traceability
- Validate that the data is complete, accurate, and usable for reporting and decision-making
Objectives
- Ensure each metric has a well-defined, accessible data source
- Verify data quality and reliability
- Identify gaps, risks, or dependencies in the data pipeline
- Enable scalable, secure, and sustainable data flows across services and teams
3.0 Inputs, Outputs, Tools and Techniques
Inputs
- Defined performance metrics (from Phase 2)
- Existing reporting systems or analytics platforms
- Access logs, API definitions, service records, manual trackers
- Data ownership and access control policies
Outputs
- Validated data sources per metric
- Data quality assurance log (completeness, timeliness, accuracy)
- Data access agreements or integration specifications
- Flagged risks, workarounds, or known limitations
Tools and Techniques
- Data Readiness Checklist: Verifies source, format, frequency, and permissions
- Data Quality Framework: Applies completeness, accuracy, timeliness, and consistency criteria
- System Access Register: Logs integration/API points and responsible owners
- Data Stewardship Map: Aligns owners with controls and governance
- Data Validation Workshops: Run collaboratively to test assumptions and ensure alignment
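The Data Readiness Checklist can be kept as lightweight as a single record per metric. The sketch below is one minimal way to structure it; the field names and the readiness rule are illustrative assumptions, not a prescribed SPARA schema:

```python
from dataclasses import dataclass

# Illustrative Data Readiness Checklist entry. Field names are
# assumptions for this sketch, not a prescribed SPARA schema.
@dataclass
class ReadinessCheck:
    metric: str
    source: str           # tool, system, or manual log the data comes from
    data_format: str      # e.g. "csv", "json", "api"
    frequency: str        # e.g. "daily", "weekly"
    access_granted: bool  # permissions confirmed with the data owner

    def is_ready(self) -> bool:
        # A source counts as "ready" only when every checklist item is
        # populated and access has been confirmed.
        return all([self.metric, self.source, self.data_format,
                    self.frequency, self.access_granted])

check = ReadinessCheck(
    metric="Mean time to restore",
    source="ITSM incident export",
    data_format="csv",
    frequency="daily",
    access_granted=True,
)
print(check.is_ready())  # → True
```

An entry with any blank field, or with access not yet granted, reports not ready, which makes incomplete sources easy to filter out before reporting begins.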
4.0 Process Steps (Activity Breakdown)
Step 1: Source Identification and Mapping
- Identify where each metric’s data will come from (tool, process, system, manual log)
- Engage system/data owners early to establish feasibility and access
Step 2: Access and Integration Assessment
- Confirm access credentials, security permissions, and integration mechanisms
- Log any required API calls, export jobs, or manual pull processes
- Identify risk of dependency on individuals, spreadsheets, or shadow systems
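A System Access Register built in this step can double as a dependency-risk check. This sketch assumes a simple list of register entries with a `backup_owner` field; both the structure and the single-owner rule are illustrative, not part of any specific tool:

```python
# Sketch of a System Access Register from Step 2. Entry fields
# ("system", "mechanism", "owner", "backup_owner") are assumptions
# for illustration only.
access_register = [
    {"system": "ITSM tool", "mechanism": "REST API",
     "owner": "Service Desk Lead", "backup_owner": None},
    {"system": "Team tracker", "mechanism": "manual spreadsheet pull",
     "owner": "Ops Analyst", "backup_owner": "Team Lead"},
]

def single_person_risks(register: list) -> list:
    """Flag systems whose access depends on a single named individual."""
    return [entry["system"] for entry in register
            if entry["backup_owner"] is None]

print(single_person_risks(access_register))  # → ['ITSM tool']
```

Running the check over the register surfaces exactly the individual-dependency risk this step asks you to identify, before it becomes a reporting outage.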
Step 3: Validate Data Quality
- Use the Data Quality Framework to score each source on:
  - Accuracy: Is the data correct and validated?
  - Completeness: Is the full picture captured?
  - Timeliness: Is the data fresh enough to inform action?
  - Consistency: Are definitions and formats uniform?
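One minimal way to apply the four criteria is a pass/fail check per criterion, rolled up into a single score per source. The equal weighting here is an assumption for illustration; teams may weight criteria differently:

```python
# Minimal sketch of the Step 3 Data Quality Framework scoring.
# Equal weighting across criteria is an illustrative assumption.
QUALITY_CRITERIA = ("accuracy", "completeness", "timeliness", "consistency")

def score_source(checks: dict) -> float:
    """Return the fraction of quality criteria a source passes (0.0-1.0).

    Missing criteria are treated as failed, so an empty assessment
    scores zero rather than passing by default.
    """
    passed = sum(1 for c in QUALITY_CRITERIA if checks.get(c, False))
    return passed / len(QUALITY_CRITERIA)

# Example: an SLA log source that fails only on timeliness.
sla_log = {"accuracy": True, "completeness": True,
           "timeliness": False, "consistency": True}
print(score_source(sla_log))  # → 0.75
```

Scoring each source the same way makes the quality assurance log comparable across metrics and gives Step 4 a concrete basis for flagging gaps.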
Step 4: Flag Gaps and Define Workarounds
- Identify metrics without viable sources
- Document temporary workarounds, proxies, or estimates
- Set remediation or data improvement actions where appropriate
Step 5: Approve for Use
- Formally approve validated data sources for inclusion in reports and analysis
- Record data lineage (how, where, and when data is pulled)
- Update the performance registry to reflect approved, rejected, or pending data sources
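Recording data lineage can be as lightweight as one structured entry per pull. This sketch assumes an append-only in-memory log; the entry fields are illustrative and in practice this would land in whatever registry or log store the organisation already uses:

```python
from datetime import datetime, timezone

# Lightweight data-lineage record for Step 5: how, where, and when
# each dataset was pulled. The entry structure is an illustrative
# assumption, not a prescribed format.
lineage_log = []

def record_lineage(metric: str, source: str, method: str) -> dict:
    entry = {
        "metric": metric,
        "source": source,
        "method": method,  # e.g. "api", "export job", "manual pull"
        "pulled_at": datetime.now(timezone.utc).isoformat(),
    }
    lineage_log.append(entry)
    return entry

record_lineage("Mean time to restore", "ITSM incident export", "export job")
print(len(lineage_log))  # → 1
```

Even a minimal log like this answers the traceability questions Phase 3 cares about: which source fed which metric, by what mechanism, and when.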
5.0 Role Examples (RACI by Org Size)
| Org Type | Responsible | Accountable | Consulted | Informed | 
|---|---|---|---|---|
| Solo | Owner | Owner | Data advisor | Stakeholder/client | 
| SME | Analyst or Ops Lead | Service Owner | IT/Systems, Legal | Team Leads | 
| Enterprise | Data Engineer or BI Lead | Data Governance Board | InfoSec, IT, Compliance | Exec Governance Forums | 
6.0 Tips for Deployment
- Build relationships with system and data owners early; they will make or break this phase
- Avoid over-reliance on spreadsheets or manually maintained trackers
- Where data doesn’t exist, be honest: don’t force unreliable proxies
- Use early dashboards or data extracts as prototypes to test usability
- Track data issues over time and create a backlog of data quality improvements
7.0 Example Data Capture Scenarios
- Pulling SLA logs from ITSM tools
- Collecting customer sentiment from feedback surveys
- Scraping ticket flow metrics from workflow engines
- Manual reporting of team retrospectives or subjective XLAs
- Integrating NPS, cost, and availability data across vendors in a SIAM model
8.0 Integration and Interoperability
- COBIT (BAI03/MEA01): Ensures traceability, data governance, and compliance
- ITIL (Measurement & Reporting): Maps to enabling accurate and meaningful service reporting
- GDPR & Data Protection: Must respect personal data handling and retention policies
- DevOps Toolchains: Can offer high-quality, real-time data if aligned early
9.0 Lever Activation Guidance
- Delivery & Assurance: Validates that measurement is trustworthy and fit for reporting
- Governance & Alignment: Ensures data use complies with policy and is assigned to responsible owners
- People & Empowerment: Ensures users understand and can trust the data they’re interpreting
