LibraryProcess overview
Reference · livelib-process:data-science-ml--ab-testing-ml
ab-testing-ml overview
A/B Testing Framework for ML Models - Comprehensive framework for designing, executing, and analyzing A/B tests to compare ML model variants with statistical rigor, traffic management, and automated decision-making.
Attributes
displayName
ab-testing-ml
description
A/B Testing Framework for ML Models - Comprehensive framework for designing, executing, and analyzing
A/B tests to compare ML model variants with statistical rigor, traffic management, and automated decision-making.
libraryPath
library/specializations/data-science-ml/ab-testing-ml.js
specialization
data-science-ml
references
- Trustworthy Online Controlled Experiments: https://experimentguide.com/
- Microsoft Experimentation Platform: https://exp-platform.com/
- Optimizely Stats Engine: https://www.optimizely.com/insights/blog/stats-engine/
- Netflix Experimentation: https://netflixtechblog.com/its-all-a-bout-testing-the-netflix-experimentation-platform-4e1ca458c15
- Spotify Experimentation: https://engineering.atspotify.com/2020/10/spotifys-new-experimentation-platform-part-1/
example
const result = await orchestrate('specializations/data-science-ml/ab-testing-ml', {
projectName: 'Recommendation Engine A/B Test',
modelA: {
name: 'content-based-v1',
version: '1.2.0',
endpoint: 'https://api.example.com/models/content-based-v1'
},
modelB: {
name: 'collaborative-filtering-v2',
version: '2.0.0',
endpoint: 'https://api.example.com/models/collaborative-v2'
},
targetMetric: 'click_through_rate',
minimumSampleSize: 10000,
confidenceLevel: 0.95,
trafficSplit: { a: 50, b: 50 }
});
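The `targetMetric`, `minimumSampleSize`, and `confidenceLevel` parameters above imply a frequentist comparison of conversion-style rates between the two variants. As a rough illustration of the kind of analysis such a framework performs, here is a minimal two-proportion z-test sketch; the function names and shapes are illustrative assumptions, not part of this library's documented API:

```javascript
// Illustrative sketch (not the library's API): two-proportion z-test for
// comparing the click-through rates of variants A and B.
function twoProportionZTest(convA, nA, convB, nB) {
  const pA = convA / nA;
  const pB = convB / nB;
  const pPool = (convA + convB) / (nA + nB); // pooled rate under H0: pA == pB
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nB));
  const z = (pB - pA) / se;
  // Two-sided p-value from the standard normal CDF.
  const pValue = 2 * (1 - normalCdf(Math.abs(z)));
  return { z, pValue };
}

function normalCdf(x) {
  // Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), with erf approximated via the
  // Abramowitz & Stegun polynomial (accurate to roughly 1e-7).
  const y = Math.abs(x) / Math.SQRT2;
  const t = 1 / (1 + 0.3275911 * y);
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t -
      0.284496736) * t + 0.254829592) * t;
  const erf = 1 - poly * Math.exp(-y * y);
  return x >= 0 ? 0.5 * (1 + erf) : 0.5 * (1 - erf);
}
```

With `confidenceLevel: 0.95` as in the example, a result would be declared significant when the returned `pValue` falls below 0.05; the `minimumSampleSize` guard exists to prevent running this test before enough traffic has accumulated.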
usesAgents
- general-purpose
Outgoing edges
lib_applies_to_domain (1)
- domain:data-science · Domain · Data Science
lib_belongs_to_specialization (1)
- specialization:data-science-ml · Specialization
lib_implements_workflow (2)
- workflow:code-review · Workflow
- workflow:ml-model-lifecycle · Workflow · ML Model Lifecycle
Incoming edges
None.
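The `trafficSplit: { a: 50, b: 50 }` parameter in the example implies per-request routing between the two model endpoints. A common way to implement such a split, sketched below under the assumption of a stable user identifier (this is an illustrative technique, not this library's documented internals), is deterministic hashing, so each user consistently sees the same variant:

```javascript
// Illustrative sketch: deterministic traffic split by hashing a stable user ID.
// A user always lands in the same bucket, so variant assignment is sticky
// across requests without any server-side state.
function assignVariant(userId, split = { a: 50, b: 50 }) {
  // FNV-1a 32-bit hash: dependency-free and stable across processes.
  let h = 0x811c9dc5;
  for (let i = 0; i < userId.length; i++) {
    h ^= userId.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  const bucket = h % 100; // bucket in [0, 99]
  return bucket < split.a ? 'A' : 'B';
}
```

Over a large population of user IDs the observed split converges to the configured percentages, while any single user's assignment never changes mid-experiment, which keeps the metric comparison between the two models unbiased.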