A/B testing and content experimentation methodology for data-driven content optimization. Use when implementing experiments, analyzing results, or building experimentation infrastructure.
Principles and patterns for running effective content experiments to improve conversion rates, engagement, and user experience.
Reference these guidelines when:

- Comparing two variants (A vs. B) to determine which performs better.
- Testing multiple variables simultaneously to find optimal combinations (multivariate testing).
- Assessing statistical significance: the confidence that results aren't due to random chance.
- Making decisions based on data rather than opinions (avoiding the HiPPO, the "highest paid person's opinion").
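For the simplest case above, comparing two variants, a minimal sketch of a two-proportion z-test using only the standard library (the conversion counts are illustrative assumptions, not figures from these guidelines):

```python
# Two-proportion z-test: is variant B's conversion rate different from A's?
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for conversion counts of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 2,400 visitors per variant
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # declare significance only if p < alpha
```

Note that in this example a 5.0% vs. 6.25% split still fails a 0.05 significance threshold; resources/statistical-foundations.md covers why sample size and power matter before interpreting a p-value.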
See resources/ for detailed guidance:
- resources/experiment-design.md — Hypothesis framework, metrics, sample size, and what to test
- resources/statistical-foundations.md — p-values, confidence intervals, power analysis, Bayesian methods
- resources/cms-integration.md — CMS-managed variants, field-level variants, external platforms
- resources/common-pitfalls.md — 17 common mistakes across statistics, design, execution, and interpretation
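As a taste of the sample-size planning covered in resources/experiment-design.md, a rough per-variant sample size estimate for a two-proportion test (the baseline rate and minimum detectable effect below are assumed example values):

```python
# Approximate sample size per variant for a two-proportion test, stdlib only.
from math import sqrt

def sample_size_per_variant(p_base, mde, alpha=0.05, power=0.8):
    """Approximate n per variant to detect an absolute lift of `mde`."""
    # z-scores for common alpha/power levels, hardcoded to avoid a stats dependency
    z_alpha = {0.05: 1.96, 0.01: 2.576}[alpha]  # two-sided test
    z_beta = {0.8: 0.842, 0.9: 1.282}[power]
    p_alt = p_base + mde
    p_avg = (p_base + p_alt) / 2
    n = ((z_alpha * sqrt(2 * p_avg * (1 - p_avg))
          + z_beta * sqrt(p_base * (1 - p_base) + p_alt * (1 - p_alt))) ** 2) / mde ** 2
    return int(n) + 1  # round up

# e.g. 5% baseline conversion, hoping to detect a 1-point absolute lift
n = sample_size_per_variant(p_base=0.05, mde=0.01)
print(n)
```

The takeaway: small effects on low baseline rates need thousands of visitors per variant, which is why stopping a test early is one of the pitfalls catalogued in resources/common-pitfalls.md.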