Minimum Viable Experiment
The smallest, fastest experiment that can validate or invalidate a hypothesis, stripping away unnecessary complexity to accelerate the learning cycle.
What Is a Minimum Viable Experiment?
The minimum viable experiment (MVE) applies lean thinking to experimentation itself. Instead of building a polished variant with full engineering support, an MVE asks: "What's the cheapest, fastest way to learn whether this hypothesis has merit?" Often, the answer is simpler than teams expect.
Most teams over-engineer their experiments. The MVE flips the default: test directionally first, and invest in precision only if the direction looks promising.
Also Known As
- Marketing: Smoke test, fake door test
- Sales: Quick validation, pilot pitch
- Growth: MVE, lean experiment, smallest testable experiment
- Product: Painted door test, Wizard of Oz test, prototype test
- Engineering: Spike, proof of concept, throwaway prototype
- Data: Pilot analysis, quick validation study
How It Works
A team is considering building an integrations marketplace — a 3-month engineering effort. Before committing, they run an MVE: add a "Coming Soon: Integrations" link to the nav that leads to a form asking which integrations users want. In two weeks, the click data and form responses reveal that 80% of demand concentrates on three specific integrations, and overall click-through is high enough to justify the investment.
Without the MVE, they would have built 15 integrations based on executive guesses. With it, they ship three integrations that cover 80% of demand and save 60% of the engineering budget.
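The analysis behind a painted-door MVE like this is usually simple arithmetic: click-through rate and the share of demand covered by the top few options. A minimal sketch, with all counts purely illustrative (not from a real test):

```python
# Hypothetical painted-door results (illustrative numbers only).
visitors = 5000            # users who saw the "Coming Soon: Integrations" link
clicks = 900               # users who clicked through to the interest form
requests = {               # integration named on the form
    "Salesforce": 320,
    "Slack": 210,
    "HubSpot": 130,
    "Zapier": 45,
    "Jira": 30,
    "Other": 65,
}

click_through = clicks / visitors
total_requests = sum(requests.values())

# Demand concentration: share of requests covered by the top 3 integrations.
top3 = sorted(requests.values(), reverse=True)[:3]
top3_share = sum(top3) / total_requests

print(f"Click-through: {click_through:.1%}")
print(f"Top-3 demand share: {top3_share:.1%}")
```

With numbers like these, the decision writes itself: demand is real, and three integrations cover most of it.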
Best Practices
- Ask the MVE question first: "Is there a cheaper way to test this hypothesis?"
- Use painted doors to validate demand before building.
- Use Wizard of Oz tests to validate value before automating.
- Accept lower fidelity when the hypothesis doesn't require full fidelity.
- Use MVE results to gate full investment, not as a substitute for rigorous testing.
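Gating full investment on MVE results works best when the thresholds are written down before the test runs. A minimal sketch of such a gate, with hypothetical threshold values:

```python
def gate_decision(click_through: float, top_demand_share: float,
                  ctr_threshold: float = 0.10,
                  concentration_threshold: float = 0.60) -> str:
    """Map MVE results to a go/no-go recommendation.

    Thresholds here are illustrative; the point is to commit to them
    before seeing the data, not after.
    """
    if click_through < ctr_threshold:
        return "no-go: demand too weak to justify the build"
    if top_demand_share >= concentration_threshold:
        return "go: build only the top-requested options"
    return "go, but scope carefully: demand is spread across many options"

print(gate_decision(0.18, 0.825))
```

The gate is deliberately coarse: an MVE informs the go/no-go call, and rigorous testing follows only on the "go" path.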
Common Mistakes
- Using MVEs where full fidelity is required — you can't MVE a performance improvement.
- Treating MVE results as definitive — they inform go/no-go decisions, not final design.
- Skipping MVE because "we already decided to build it" — sunk cost thinking kills MVE discipline.
Industry Context
SaaS/B2B: MVEs are especially powerful for B2B feature validation. Painted door tests on "advanced" features reveal demand before expensive engineering investment.
Ecommerce/DTC: MVEs validate new product categories, checkout changes, or loyalty programs before committing to infrastructure.
Lead gen: MVEs work well for testing new lead magnets and landing page concepts using simple tools rather than custom builds.
The Behavioral Science Connection
MVEs counter the sunk cost fallacy — the tendency to continue investing in something because of prior investment. By keeping initial investment small, MVEs make it psychologically easier to abandon bad ideas. They also counter the planning fallacy by revealing the minimum work needed to learn.
Key Takeaway
MVEs can double or triple effective test velocity by removing engineering as the bottleneck for hypothesis validation: learn fast, then invest only where the direction is confirmed.