workflow:data-quality-investigation
Data Quality Investigation overview
Investigates data quality anomalies and drives remediation:
- triaging data quality alerts from automated monitoring or stakeholder reports
- profiling affected datasets to characterize anomaly scope and impact
- tracing root cause through data lineage, from source systems through transformation layers
- assessing downstream impact on reports, dashboards, and business decisions
- implementing corrective data fixes and validating them against expected values
- recommending preventive controls, including schema validation, freshness checks, and distribution monitoring
- communicating impact assessment and remediation timeline to affected stakeholders

Produces root-cause analysis, impact assessment, and prevention recommendations. Excludes data pipeline refactoring and monitoring tool configuration.
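The preventive controls named above (schema validation, freshness checks, distribution monitoring) can be sketched as standalone check functions. This is a minimal illustration using only the Python standard library; the schema, column names, and thresholds are hypothetical examples, not part of the workflow definition.

```python
from datetime import datetime, timedelta, timezone
from statistics import mean

# Hypothetical expected schema for an orders dataset (illustrative only).
EXPECTED_SCHEMA = {"order_id": int, "amount": float, "updated_at": datetime}

def check_schema(row: dict) -> list:
    """Schema validation: return a list of violations for a single record."""
    issues = []
    for col, typ in EXPECTED_SCHEMA.items():
        if col not in row:
            issues.append(f"missing column: {col}")
        elif not isinstance(row[col], typ):
            issues.append(f"{col}: expected {typ.__name__}, got {type(row[col]).__name__}")
    return issues

def check_freshness(latest: datetime, max_lag: timedelta) -> bool:
    """Freshness check: fail when the newest record exceeds the allowed lag."""
    return datetime.now(timezone.utc) - latest <= max_lag

def check_distribution(values: list, baseline_mean: float,
                       baseline_std: float, z: float = 3.0) -> bool:
    """Distribution monitoring: fail when the batch mean drifts more than
    z baseline standard deviations from the expected mean."""
    return abs(mean(values) - baseline_mean) <= z * baseline_std
```

In practice these checks would run inside whatever monitoring framework the data platform already uses; the point here is only the shape of each control, not a specific tool.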
Attributes
Outgoing edges
- domain:data-engineering · Domain · Data Engineering
- domain:business-intelligence · Domain · Business Intelligence
- role:data-analyst · Role · Data Analyst
- role:data-engineer · Role · Data Engineer
- role:business-analyst · Role · Business Analyst
- org-unit:data-platform-team · OrgUnit · Data Platform Team
- org-unit:data-governance-team · OrgUnit · Data Governance Team
- skill-area:data-quality · SkillArea · Data Quality
- skill-area:python-data-pipelines · SkillArea · Python Data Pipelines
- responsibility:data-quality-monitoring · Responsibility · Data quality monitoring
- responsibility:postmortem-writeup · Responsibility · Postmortem writeup