II. Workflow overview
Reference · liveworkflow:experiment-reproducibility-review
Experiment Reproducibility Review overview
Validates that computational experiments can be reproduced — checking environment pinning, random seed management, data versioning, and result consistency across runs. Excludes experiment design.
Attributes
- displayName: Experiment Reproducibility Review
- workflowKind: governance
- triggerType: event-driven
- typicalCadence: per-experiment
- complexity: single-team
- description: Validates that computational experiments can be reproduced — checking environment pinning, random seed management, data versioning, and result consistency across runs. Excludes experiment design.
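The result-consistency check described above can be sketched as a small harness that runs an experiment several times with a pinned seed and compares fingerprints of the outputs. This is a minimal illustration, not the workflow's actual tooling; `run_experiment`, `result_fingerprint`, and `check_reproducibility` are hypothetical names, and the experiment body is a stand-in for real pipeline code.

```python
import hashlib
import json
import random


def run_experiment(seed: int) -> dict:
    # Hypothetical experiment: fully deterministic given the seed.
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    return {"mean": sum(samples) / len(samples)}


def result_fingerprint(result: dict) -> str:
    # Canonical JSON (sorted keys) so identical results hash identically.
    payload = json.dumps(result, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def check_reproducibility(seed: int = 42, runs: int = 3) -> bool:
    # All runs with the same seed must produce the same fingerprint.
    fingerprints = {result_fingerprint(run_experiment(seed)) for _ in range(runs)}
    return len(fingerprints) == 1
```

In practice the same fingerprint comparison is usually applied across machines and environments as well, which is where the environment pinning and data versioning checks come in: a mismatch in library versions or input data surfaces as a fingerprint divergence.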
Outgoing edges
applies_to_domain (2)
- domain:scientific-computing · Domain · Scientific Computing
- domain:data-science · Domain · Data Science
involves_role (2)
- role:data-scientist · Role · Data Scientist
- role:ml-engineer · Role · Machine Learning Engineer
performed_by_org_unit (2)
- org-unit:research-engineering · OrgUnit · Research Engineering
- org-unit:ml-team · OrgUnit · ML Team
requires_skill_area (2)
- skill-area:python-data-pipelines · SkillArea · Python Data Pipelines
- skill-area:ml-fine-tuning · SkillArea · ML Fine-Tuning
triggers_responsibility (1)
- responsibility:data-quality-monitoring · Responsibility · Data quality monitoring
Incoming edges
follows_workflow (1)
- stack-profile:research-data-platform · StackProfile · Research Data Platform (Python, Jupyter, PostgreSQL, Boto3, FastAPI, React)