
How to Run Customer Interviews That Actually Work

The Mom Test boiled down to a playbook. What to ask, what to ignore, and how to extract signal from polite enthusiasm.
8 min read · Updated Apr 30, 2026

Most founder customer interviews fail the same way: the founder asks leading questions, gets polite enthusiasm, walks away convinced the product is needed, and ships a feature nobody actually wants. Rob Fitzpatrick named the pattern in The Mom Test: people lie to be nice. The fix is asking different questions.

The single rule

Never ask hypotheticals about your product. Always ask about their past behavior.

Bad: "Would you use a tool that automates X?" — invites flattery.

Good: "How do you handle X today? Walk me through what you did last time." — invites truth.

This rule is annoyingly simple and unreasonably effective. Apply it to every question you write, and the quality of your interviews triples.

The three questions worth asking

  1. "Walk me through how you do X today." Concrete process, real frequency, real frustrations. Note the specific tools, time investment, and pain points.
  2. "What have you tried to fix this?" If they haven't tried anything, the problem isn't urgent enough to pay for. If they've tried 3 different tools, you have a market.
  3. "Walk me through the last time it really hurt." A specific incident, not an abstraction. Real pain produces real stories. No specific story = the pain isn't acute.

Signals to watch for

Strong signals — keep going:

  • Specific recent stories. "Last Tuesday this thing broke and I was on the phone for 3 hours."
  • They've already paid for a partial solution. Existing budget = real pain.
  • They volunteer follow-ups. "Can I introduce you to my friend Sarah, she has the same problem worse than I do."
  • They want a demo without prompting. "When can I see what you're building?"

Weak signals — be skeptical:

  • "That's a great idea, you should build it." Almost always meaningless. People are encouraging because they're polite.
  • "I would definitely use it." Future-tense willingness to use is the weakest possible signal. Past-tense willingness to pay is the strongest.
  • "How much would you charge?" Pricing curiosity before any product reveal usually means they're polite, not buying.
  • "Send me an email when it's ready." The "I'll think about it" of customer interviews. Almost never converts to actual usage.

The interview structure

20-30 minutes. Three sections.

Opening (3 minutes)

Frame it: "I'm researching how teams handle X. Not selling anything yet — I just want to understand your workflow. Mind if I ask a few questions about how you work today?"

This framing disarms them. They'll be more honest because they're not in a sales conversation.

Discovery (15-20 minutes)

Ask the three questions above. Listen 80% of the time. Probe gently: "What did you do next? How long did that take? How often does this happen?"

When they describe a pain, don't jump to your solution. Stay in their world. Ask: "What does that cost you?" or "How would you describe this problem if you were trying to get budget for it?"

Bridge (3-5 minutes)

End with: "I'm building something in this space — would you be interested in being an early test user?" If they say yes, schedule a follow-up specifically for the demo. Different conversation.

Synthesis — where the value is

The interviews are 30% of the value. The synthesis is 70%.

After every 5-10 interviews:

  • What problem language did they use? (Verbatim quotes are gold.)
  • What workarounds did they describe?
  • What did they say they'd pay for? Or have they already paid for?
  • Who did they describe as having the same problem worse than they do?
  • Where did the conversation surprise me?

The patterns become visible at 10 interviews. By 30 you usually have a clear ICP, a sharper problem statement, and a much better idea of what to build first.

Common interview mistakes

  • Pitching during discovery. Switches the conversation from honest to polite.
  • Talking more than 30%. Founder enthusiasm dominates and biases the answers. Listen.
  • Treating positive enthusiasm as data. Compliments are not commitments. Treat encouragement as noise unless it comes with a specific past behavior.
  • Skipping the synthesis. 30 interviews without synthesis is 30 interviews wasted.
  • Confirmation bias. Reading interviews for evidence your idea is good. Read them for evidence your idea is wrong — you'll learn faster.

Pair this guide with our PMF guide for what to do with the patterns you find. The interviews tell you what to build; PMF tells you when you've built it.

FAQ

How many customer interviews should I do?
30-50 in a focused 2-week sprint when validating a hypothesis. Then 5-10 a week ongoing throughout the company's life. Founders who do fewer than 5/week lose touch with reality.

Should I demo my product during a discovery interview?
No, not in the first interview. Demoing biases the conversation toward your solution. Save the demo for a second meeting once you've heard the actual problem in their words.

How do I find people to interview pre-launch?
Cold outreach to 50-100 people in your target ICP via LinkedIn or email. Offer no incentive — people who'll talk to a stranger founder for 20 minutes are exactly the right pool. A 5-10% reply rate is normal.

Do I need to record interviews?
Yes, with consent. Otter, Fathom, or Granola transcribe automatically. The transcript is more useful than your live notes 2 weeks later when patterns start to emerge.

Should I share interview findings with my team?
Always. Do a weekly synthesis with at least your co-founder. Patterns are visible to two people in ways they're not to one. Founders who hoard customer insights make worse decisions.