
The Survey Trap: How Poor Question Design Kills Good Strategy

  • Jan 22
  • 3 min read

Updated: Mar 2


A flawed survey is worse than no survey. It gives you a dangerous illusion of data-driven decision-making while steering you toward bad conclusions. After years translating consumer insights into strategy across organizations, I've seen countless companies invest heavily in research only to get useless results because they asked the wrong questions in the wrong way, or failed to ask the right questions at all.


The problem isn't lack of data. During my time at GfK Asia managing market insights for consumer electronics clients, I learned that questionnaire construction is both art and science. A single leading question, a poorly ordered sequence or an incomplete scale can systematically skew results, leading executives to wrong conclusions and, from there, to bad decisions.


Where Surveys Go Wrong


Leading questions tell respondents what to say. Simply by asking "How much do you love our premium service?", you've already framed the response. Ask instead: "How would you describe your recent service experience?" Neutral framing allows the truth to emerge rather than forcing confirmation of a hypothesis.


Answer options cut off the most useful responses. When we developed customer segmentation models at RWS, we initially used standard satisfaction scales. They told us people were "satisfied", which is insufficient. We reconstructed the surveys to capture behavioral intent, emotional drivers and competitive alternatives, which let us understand why high-tier members, and members of certain nationalities, stayed and what would make them leave.


Question order can mislead responses. Ask about price before value and you prime cost sensitivity. Ask about competitors before your brand and you anchor perceptions. Sequence matters, so consider randomizing question order across split samples to identify and eliminate sequence bias.
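To make the split-sample idea concrete, here is a minimal sketch of one way to assign question orders. The question names and the two-way split are illustrative assumptions, not from any real survey; the point is that order is fixed within a split but varies across splits, so order effects can be compared between groups.

```python
import random

# Hypothetical question identifiers, for illustration only.
questions = ["price_fairness", "perceived_value", "brand_preference", "competitor_awareness"]

def assign_order(respondent_id: int, n_splits: int = 2) -> list:
    """Give each split sample its own fixed, shuffled question order.

    Seeding the shuffle with the split index keeps the order stable
    within a split while varying it across splits.
    """
    split = respondent_id % n_splits
    rng = random.Random(split)   # deterministic per split
    order = questions[:]
    rng.shuffle(order)
    return order

# Respondents 0 and 2 fall in the same split, so they see the same order.
assert assign_order(0) == assign_order(2)
```

Comparing answer distributions between the splits then tells you whether sequence, not sentiment, is driving the numbers.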


Constructing Prescriptive Surveys

Prescriptive insights are the kind that tell you what to do, not just what happened. This requires deliberate design:


1. Start with the decision, work backward to the question.

Before writing a single question, ask: "What decision will this insight inform?" For example, when redesigning high-tier membership benefits, instead of asking "Are you satisfied with current benefits?", we asked: "If you could only keep three benefits, which would they be? What's missing that would make you increase your visits?"

The first question validates the status quo. The second reveals priorities and unmet needs.


2. Ask about behaviour, not stated intentions.

People are terrible at predicting their own behavior. "Would you purchase X?" is almost worthless. A more useful line: "When did you last purchase something in this category? What triggered that decision? What nearly stopped you?"


At NUS Business School, understanding international student decision journeys required moving beyond "Would you consider this program?" and into the specifics: what information sources they consulted and what eliminated certain schools from consideration early on. That shift produced findings we could actually act on.


3. Build in competitive context.

Asking about your brand in isolation creates inflated scores, so always include a competitive set: "Which brands did you consider? Why did you choose this one? What would make you switch?"

This doesn't just measure preference; it also reveals your actual competitive strengths and weaknesses.


4. Segment responses from the start.

Generic averages hide strategic opportunities. Design surveys to capture the lifestyle, behavioral and attitudinal variables that enable clustering. This moves you from "customers want better service" (useless) to "Segment A values speed and convenience; Segment B values recognition and personalization" (actionable).
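The jump from one average to named segments can be sketched in a few lines. The respondents, variable names and the simple dominant-attitude rule below are all hypothetical; in practice you would use a proper clustering method (k-means, latent class analysis) on many more variables.

```python
# Illustrative respondent records; scores are invented for the sketch.
respondents = [
    {"id": 1, "speed_importance": 9, "recognition_importance": 3},
    {"id": 2, "speed_importance": 2, "recognition_importance": 8},
    {"id": 3, "speed_importance": 8, "recognition_importance": 4},
]

def segment(r: dict) -> str:
    """Rule-based stand-in for clustering: label each respondent
    by whichever attitude dominates their answers."""
    if r["speed_importance"] >= r["recognition_importance"]:
        return "A: speed & convenience"
    return "B: recognition & personalization"

# Segment-level views replace a single generic average.
segments = {r["id"]: segment(r) for r in respondents}
```

Once every respondent carries a segment label, every downstream cut of the data (satisfaction, churn intent, benefit priorities) can be read per segment instead of in aggregate.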


5. Test for bias.

Split samples with reworded questions. Check for order effects. Validate stated preferences against actual behavior. Run cognitive pretests routinely: watching people complete surveys while thinking aloud reveals where questions confuse, lead or miss the point entirely.
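Checking a split sample for wording bias is a standard two-sample comparison. The counts below are invented for illustration; a 2x2 chi-square statistic is computed by hand here to stay dependency-free, though in practice you would reach for something like scipy.stats.chi2_contingency.

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 table [[a, b], [c, d]]
    (positive/negative answers under two question wordings)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical split-sample counts:
# Leading wording ("How much do you love...?"): 80 positive, 20 negative.
# Neutral wording:                              55 positive, 45 negative.
stat = chi_square_2x2(80, 20, 55, 45)

# A statistic well above 3.84 (the 5% critical value at 1 degree of
# freedom) flags that the wording itself is shifting responses.
assert stat > 3.84
```

When the statistic clears the critical value, the two wordings are producing genuinely different answer distributions, and the leading version should be rewritten before fielding.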


From Data to Decision

Even a well-constructed survey fails if the analysis treats it as a standalone report. The findings need to connect directly to the decisions they were designed to inform, whether that's lifecycle strategy, partnership priorities, product development, or operational changes.


The goal isn't perfect data; it's getting at the truth.

Well-constructed surveys reveal uncomfortable realities, challenge assumptions, and point toward specific actions. Badly constructed surveys tell you what you want to hear while your market position quietly erodes.


Survey design isn't just a research task; it's a strategic discipline.

The quality of your questions determines the quality of your insights, which determines the quality of your decisions.


 
 
 
