"Show Me the Data" Is a Stall Tactic
You've just presented a recommendation based on user interviews and heuristic analysis. The engineering lead says: "Interesting, but do you have quantitative data?" You don't — because the kind of data they're asking for would require a six-week experiment you'll never get approved. This issue gives you the language to name the pattern and the playbook to respond.

You have just presented a recommendation based on six user interviews and a heuristic walkthrough of the current flow. You showed the friction points. You mapped the drop-off. You proposed a clear direction.
The engineering lead leans back and says: "Interesting, but do you have quantitative data?"
You do not. Because the kind of data they are asking for would require a six-week instrumented experiment with a control group and a statistically significant sample. You both know that experiment will never get approved. There is no budget for it. There is no time for it. And even if there were, the PM already needs a decision by Friday.
The question was not a request for data. It was a move. And if you do not recognize it as one, you will spend the next two years believing your research is never good enough.
The asymmetric evidence standard
Jon Haidt, in his work on technology regulation, identified the exact structure of this rhetorical pattern. Companies defending their products use a specific deflection: they demand product-specific causal proof in a domain where that proof is structurally impossible to obtain. The evidence threshold is set so high that nothing can clear it. Not because the evidence does not exist, but because the kind of evidence being demanded cannot be produced within the constraints anyone is willing to accept.
The same move happens in product organizations every week.
A designer presents qualitative evidence — user interviews, heuristic analysis, behavioral patterns, support ticket clusters. The evidence is real. It points in a clear direction. But it is not a number. And someone in the room exploits that gap by requesting a number that would require resources nobody is going to allocate.
The request sounds reasonable. It sounds like rigor. It is not rigor. Rigor would be evaluating the evidence that does exist on its own terms. What is actually happening is a status quo defense disguised as scientific methodology. The person asking the question does not want better data. They want nothing to change.
This is important to understand because it changes what you do next. If you treat the request as genuine, you go away and try to produce quantitative data you cannot produce. You come back weeks later with a weaker version of the same argument, and the cycle repeats. If you recognize it as a move, you respond differently.
Three responses that reframe the conversation
Name the evidence standard directly. "We have six interviews pointing in the same direction and a heuristic analysis that confirms the pattern. That is the evidence we have. What would it take to act on this level of evidence? And what is the cost of waiting for a higher level?"
This forces the room to articulate the actual decision threshold instead of leaving it implied. Most of the time, nobody has thought about what level of evidence would be sufficient. The question exposes that gap.
Flip the burden. "We have qualitative evidence of a problem. Do we have any evidence that the current approach is working?"
In most cases, the answer is no. The current approach is not defended by data either. It is simply the default. By asking for evidence of the status quo, you reveal the asymmetry: new directions are held to a standard that the existing direction was never required to meet.
Scope the cost of inaction. "If this pattern is real — and six out of six users hit the same friction point — what does it cost us per quarter to leave it in place?"
This translates the qualitative finding into the language the room already speaks. You are not asking them to trust your interviews. You are asking them to estimate a business risk. The conversation shifts from "is this real?" to "can we afford to ignore it?"
One move
Before your next recommendation, write one sentence that names the evidence standard you are working within. "Based on six user interviews and a heuristic analysis of the current flow, I am recommending..." Then ask: "What level of evidence would this team need to act?"
Ask the question before someone else uses it against you.

