Hypothesis-Driven Development (HDD)
Hypothesis-Driven Development (HDD) is a product and engineering approach in which work is framed around testable hypotheses rather than unexamined assumptions.
It encourages learning through experimentation, reduces waste, and enables teams to build only what is needed to achieve outcomes.
This practice shifts the focus from delivering outputs to delivering validated outcomes.
Level 1 – Initial (Ad Hoc)
Decisions are made based on assumptions, intuition, or senior opinion.
There is no structure for articulating or testing what the team believes will work.
- Requirements are treated as fixed outputs to be delivered
- Success is measured by delivery of features, not impact
- Learning is reactive and informal
- Teams may build full solutions without testing value
- The team has little ability to change course when a need proves unmet or an assumption is invalidated
Level 2 – Managed (Emerging Practice)
Some teams begin to articulate assumptions and outcomes, but hypotheses are not formally structured or measured.
- Problem framing may happen, but solution bias remains high
- Teams explore MVPs or early tests, but not consistently
- Hypotheses may be written down, but often lack clear success metrics
- Work is occasionally pivoted based on late feedback
- HDD is seen as a UX or product practice rather than a cross-functional one
Level 3 – Defined (Standardised)
Hypothesis framing is part of standard delivery practice.
Teams define, test, and validate hypotheses in collaboration with stakeholders before committing to full-scale delivery.
- Work items are structured around hypotheses, not fixed requirements
- Hypotheses include metrics, timeframes, and methods of validation
- MVPs, feature flags, and early rollouts are used to test assumptions
- Teams iterate based on results and adapt quickly to new insights
- Product, design, and engineering collaborate on defining experiments
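At this level, a hypothesis carries a metric, a target, and a timeframe. The shape of such a work item can be sketched as a small data structure; the field names, example wording, and thresholds below are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """A hypothesis-framed work item (all fields are illustrative)."""
    belief: str         # what we believe will happen
    change: str         # the smallest change that can test it
    metric: str         # the signal we will measure
    target: float       # success threshold for the metric
    deadline_days: int  # timeframe for validation

    def evaluate(self, observed: float) -> str:
        """Compare the observed metric against the target."""
        return "validated" if observed >= self.target else "invalidated"

# Hypothetical example: does a simplified checkout raise conversion?
h = Hypothesis(
    belief="A one-page checkout will raise conversion",
    change="Ship one-page checkout behind a feature flag to 10% of users",
    metric="checkout_conversion_rate",
    target=0.045,       # success: conversion >= 4.5%
    deadline_days=14,
)
print(h.evaluate(0.048))  # -> validated
```

Structuring work items this way makes the success criteria explicit before delivery starts, so "done" means validated or invalidated rather than merely shipped.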
Level 4 – Quantitatively Managed (Measured & Controlled)
Hypothesis-driven development is supported by feedback systems, analytics, and governance.
Teams use data to prioritise and optimise work based on validated learning.
- Experiments are prioritised based on value, risk, and impact
- Hypothesis tracking is integrated into planning and delivery tools
- Experiment velocity and learning effectiveness are monitored
- Failures are analysed and used to improve future hypothesis framing
- Stakeholders use evidence from experiments to make investment decisions
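Prioritising experiments by value, risk, and impact can be reduced to a simple scoring rule. The function below is one minimal sketch, assuming a score of expected learning per unit effort; the weights and backlog entries are made up for illustration, not a standard formula.

```python
def experiment_score(value: float, risk: float, impact: float, effort: float) -> float:
    """Illustrative priority score: experiments that promise more value,
    retire more risk, or have larger impact per unit effort rank first."""
    return (value + risk + impact) / effort

# Hypothetical backlog scored on a 1-10 scale per dimension.
backlog = {
    "one-page checkout": experiment_score(value=8, risk=5, impact=7, effort=3),
    "dark-mode theme":   experiment_score(value=3, risk=1, impact=2, effort=2),
}
for name, score in sorted(backlog.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")  # highest-priority experiment first
```

Even a rough score like this makes the prioritisation conversation explicit and comparable across teams, which is the point of quantitative management at this level.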
Level 5 – Optimising (Continuous Improvement)
HDD is part of the organisation’s innovation engine.
Learning is continuous, and decision-making is data-informed at every level.
- Teams frame hypotheses by default for any significant change
- Multiple hypotheses are tested in parallel to optimise learning
- Results are shared widely and feed into collective knowledge
- Leadership supports experimentation through governance, budget, and culture
- HDD practices evolve through organisational learning and external benchmarking
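Running multiple hypotheses in parallel requires assigning each user to a variant deterministically and independently per experiment. One common approach is hash-based bucketing, sketched below; the experiment names and variants are hypothetical, and real experimentation platforms add exposure logging, holdout groups, and statistical guardrails on top.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list) -> str:
    """Deterministically assign a user to a variant by hashing the
    (experiment, user) pair, so the same user always sees the same arm."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Because the experiment name is part of the hash, assignments across
# parallel experiments are independent for the same user.
print(assign_variant("user-42", "checkout-layout", ["control", "one-page"]))
print(assign_variant("user-42", "pricing-page",    ["control", "annual-first"]))
```

Deterministic assignment keeps a user's experience stable across sessions without storing per-user state, which is why hashing is a common basis for parallel experimentation at scale.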