Working in Small Batches
Working in Small Batches is the practice of breaking work down into small, manageable increments that can be delivered, tested, and validated quickly.
It reduces risk, enables faster feedback, improves flow efficiency, and allows teams to learn continuously through rapid iteration.
Level 1 – Initial (Ad Hoc)
Work is delivered in large, infrequent batches.
Changes are bundled together, increasing complexity, risk, and feedback delays.
- Work items often span weeks or months
- Testing and release are deferred until "everything is done"
- Problems are hard to isolate due to batch size
- Feedback loops are slow or missing
- Delivery dates are hard to predict
Level 2 – Managed (Emerging Practice)
Some teams experiment with breaking work into smaller parts, but batch size still varies significantly across teams or products.
- User stories or tasks are sized inconsistently
- Some releases are smaller, but not reliably so
- Business stakeholders may still push for "big bang" delivery
- Smaller batches are seen as a tradeoff, not the default
- Teams may reduce size for tactical reasons, not as a strategic improvement
Level 3 – Defined (Standardised)
Small batch delivery is a defined and encouraged practice across the organisation.
Teams routinely break down work to enable faster flow and earlier validation.
- Work items are consistently sized to be delivered in days, not weeks
- Batch size limits are applied in planning (e.g. no epic enters delivery without being decomposed)
- Features are broken down using thin-slice or MVP techniques
- Stakeholders are aligned on incremental value delivery
- Small batch size improves flow, feedback, and learning
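To illustrate a planning-time batch size limit, the sketch below flags oversized work items so they are decomposed before entering delivery. The field names (`id`, `estimate_days`) and the three-day threshold are assumptions for illustration, not a prescribed tool or policy.

```python
# Hypothetical planning check: flag work items whose estimate exceeds
# a small-batch threshold so they are split before delivery.
MAX_BATCH_DAYS = 3  # assumed team policy: deliverable in days, not weeks


def oversized_items(backlog, max_days=MAX_BATCH_DAYS):
    """Return items that should be decomposed before entering delivery."""
    return [item for item in backlog if item["estimate_days"] > max_days]


backlog = [
    {"id": "STORY-1", "estimate_days": 2},
    {"id": "EPIC-7", "estimate_days": 15},
]
for item in oversized_items(backlog):
    print(f"{item['id']}: decompose before delivery ({item['estimate_days']}d)")
```

A check like this could run as part of sprint planning or a backlog linter, making the small-batch default visible rather than relying on individual judgment.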
Level 4 – Quantitatively Managed (Measured & Controlled)
Batch size is tracked and used to improve flow predictability and risk management.
Teams use flow metrics to adjust batch size dynamically based on complexity and capacity.
- Metrics include cycle time, lead time, batch size variance, and failure rates
- Teams use flow analytics to identify when batches are too large
- Pipelines and environments support continuous validation of small changes
- Defect rate and rework are analysed relative to batch size
- Business and tech teams collaborate on slicing work around risk and learning
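A minimal sketch of the flow metrics named above, computed from completed work items. The tuple layout (created, started, finished, batch size) and the sample dates are hypothetical; real data would come from a work-tracking system.

```python
from datetime import date
from statistics import mean, pstdev

# Hypothetical completed work items: (created, started, finished, batch_size).
items = [
    (date(2024, 1, 1), date(2024, 1, 3), date(2024, 1, 5), 2),
    (date(2024, 1, 2), date(2024, 1, 4), date(2024, 1, 9), 5),
    (date(2024, 1, 3), date(2024, 1, 5), date(2024, 1, 8), 3),
]

# Cycle time: start of work to completion. Lead time: request to completion.
cycle_times = [(done - started).days for created, started, done, _ in items]
lead_times = [(done - created).days for created, _, done, _ in items]
batch_sizes = [size for *_, size in items]

print(f"mean cycle time: {mean(cycle_times):.1f} days")
print(f"mean lead time:  {mean(lead_times):.1f} days")
print(f"batch size std dev: {pstdev(batch_sizes):.2f}")
```

Tracking these per team over time is what lets flow analytics surface when batches are drifting too large, e.g. rising cycle-time variance correlating with larger batch sizes.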
Level 5 – Optimising (Continuous Improvement)
Small batch working is deeply embedded and continuously refined.
Teams adapt batch size based on context, optimise flow, and improve decision quality through faster learning.
- Teams experiment with ultra-small batches (e.g. single-piece flow)
- Work is sliced around hypotheses, not just technical tasks
- Observability and analytics allow rapid course correction post-release
- Delivery cadence supports real-time decision-making
- Small batch flow enables continuous delivery and strategic agility