Elements of a good research question, common mistakes and how researchers can avoid them

Adapted from and in collaboration with the Happy Market Research podcast, Episode 301

As researchers, we are always trying to better our craft. To do this, we attend seminars, conferences, and webinars, and we listen to podcasts. But one of the most effective ways to learn is simply to sit down with a peer and have a conversation. So that’s exactly what we did. 

In this post, we hear from professionals leading user experience and market research at some innovative companies, like Shopify. They’ve given us their top tips for asking participants good research questions, as well as how to avoid common mistakes.  

If you are involved in consumer insights from either a practitioner’s or a buyer’s perspective, this episode is for you. 

Before we start with the top tips, let’s look at some of the most common mistakes researchers make when asking a research participant a question.

Mistake 1: Leading questions

Leading questions came up a lot. Josh LaMar, a well-known user experience researcher, said,

“I think that the biggest mistake is to either ask a leading question or to frame it too narrowly first. We’ve talked about framing narrowly first, so I guess we could talk about leading questions now. Which are things like, tell me how amazing this product is. That’s an over-exaggeration, but it can be much more subtle too. Like, if you’re only asking about the positive aspects of something or you’re saying, oh, this is a really great feature, isn’t it? Well, what did you just do there? You told the user-you primed them, number one, to say, I like this feature, and then I created this tag question like, isn’t it? Don’t you agree with me? You should agree with me because I’m the smart one here. You just made the user feel dumb, and then you also told them exactly what you want to hear. So what are they gonna do? They’re gonna tell you what you want to hear because they want to make you happy. And it’s so important as a researcher, to be very neutral and to ensure that you’re not letting too much of your own feelings ever come out. Because as soon as you start letting on like, this is really dumb, isn’t it? Yeah, I don’t really use this, but we need to test this for our client. Can you just tell us that thing? You’re throwing out the whole study data if you do that because it’s too leading, you don’t want to lead them on to the answer. The answer is what they think, not what you think.”

For kicks, watch this two-minute excerpt on leading questions from Yes Minister, a British TV series that originally aired in the early 1980s. It’s a comical bit that serves as a clear example of why it’s important to be thoughtful about your study design and the questions you ask.

Mistake 2: Double-barreled questions

Similar to leading questions are double-barreled questions. Zoe Dowling, SVP at FocusVision, talked about these. As described on Wikipedia, “This is an informal fallacy. It is committed when someone asks a question that touches upon more than one issue, yet allows only for one answer. This may result in inaccuracies in the attitudes being measured for the question, as the respondent can answer only one of the two questions, and cannot indicate which one is being answered.” Dowling said,

“Double barreled questions. How can you really answer that? You’re leading me into – it’s the basics. It’s you’re leading me into this response. I can’t respond to it the other way. We’re all the time- it’s like I can’t respond to that at all. I- none of those apply, and we don’t give any- we don’t give- we’re constructing these questions to allow, and this is actually more on the quantitative side because at least on the qualitative side, people ask- you get to some sort of response, whether it’s what you want or not. People will give their opinion because it’s open ended. Whereas in a closed ended survey question, you’re dictating the whole frame of it. The question you’re asking and the responses they get. And it’s like no, that doesn’t apply to me. You’re not getting to my opinion, and I think those are some of the things you see frequently and we’re all guilty of it because you, the person that’s designing the instrument, you’re bound by your own parameters and how you’re viewing it and how you’re framing it.”

Now, let’s move on to some tips…

Tip 1: Keep it conversational

In addition to double-barreled questions, Dowling outlines the need for us to talk in conversational, human terms. She said,

“I think the fundamentals remain the same, whether you’re asking a question in a survey or constructing it for an interview. I mean obviously there’s some fundamental differences. If you think about- the first thing are you gonna be understand? Talk in everyday language. I think too often we want to frame- we either bring in the world that we’re in, be it the actual industry. We’ve got particular language jargon that we’re using. Or you might think that I need to be so incredibly specific that you end up creating this very convoluted, the way it’s constructed question that is anybody gonna- you’ve just said it. We read in headlines. So do our participants. They scan. In fact, very often in a survey, they actually just go straight to the answers to determine that the question was and how they’re going to respond. So it’s been clear. It’s been concise. And I think that kind of works for both sides, qualitative or quantitative. Because if we’re qualitative, you’re gonna take the question and you can probe. You can go deeper and you’re gonna take it all from there. But if you start with something that’s very convoluted. Then, well, you’re probably not gonna get to where you really wanted to go in the first place. That would be my overarching thought. We sometimes over-engineer our questions.”

Keeping things simple can be the best way to connect the intent of your question to the participant. Here is an example: if we want to know how much people like a new electronic coffee mug that keeps the liquid hot, we could ask,

“Most coffee makers produce a cup of coffee that is 170 degrees. After you pour the coffee into a mug or other preferred drink container of choice, how does the change in temperature of that coffee’s life cycle impact your enjoyment level?” 

Versus, 

“Think about your last cup of coffee. How did its cooling impact your experience?”

Part of the issue here is that surveys and discussion guides are often written by a committee, and, as my late grandfather used to say, a camel is a horse designed by a committee.

Tip 2: Use common terms

Okay, let’s look at some more tips to frame a good question. 

In full disclosure, I made a big mistake in my first interview with Emma Craig of Shopify. Here was the question I asked her, 

“Key elements of a good question?”

The way I framed the question was ambiguous, and that created confusion. She didn’t know what we were talking about. Were the elements specific to research objectives, a question in a survey, a question in a focus group, or a user experience study? 

It is so easy to assume your participants are starting with the same framework as you are. Nomenclature, colloquialisms, phraseology, mindset… these are just some of the things we need to think about.

So, the updated question we asked was, “What are the key elements of a good participant question?” 

Tip 3: Know why you are asking each question

Having only been in research for a year, I found this episode really useful, especially when listening to Harry Brignull’s interview. For those who don’t know, Brignull is the UX specialist who first coined the term “dark patterns.” Dark patterns are tricks used in websites and apps that make you do things you didn’t mean to, like buying or signing up for something.

Brignull mentioned that in order to ask the right questions, you have to ask yourself why you’re running the study and what your goal is. Once you’ve answered that, you can move on to creating good questions for participants.

Tip 4: Start broad then go narrow

Another point Brignull made was about starting diagnostic questions broad and then narrowing things down. Brignull said,

“I think it’s very easy to focus on the small details of the research. And researchers can feel very safe when they focus on small things, like the recruitment, specification, exact wording of the questions. But in my opinion, what defines good research and then it sort of cascades into the questions is the overarching research objectives. So what are you doing the research for in the first place? And if you don’t get that right, the questions are inconsequential. And if you do get it right, the questions become much easier to write anyway. So what do I mean by that? Basically it’s very common, particularly when you got a new job or if you’re a junior researcher to have someone come along and for example, a product manager or product director or someone in management try and tell you the objectives in advance of what you should be doing your research on. And managers tend to be very feature-focused, so they’re probably going to be very specific and have a very narrow brief about the one thing that they care about at that point in time. So for example, imagine you’re a researcher and you’ve got a new job and the team you’re joining has never done any user research. And your manager, or product owner or whatever comes along and says, “I want you to do some research on this particular dashboard that we’re building for… This dashboard is used by this one particular user type.” Let’s say you’ve got six user types and it’s used by one of them. So if you go and do that research, you’ll probably make that person happy. But you’ll still be kind of in the dark about the big picture. So what about the other five user types that we talked about there? What about the broader user needs? What were the most worrying or the least understood things about the problems that your product is trying to solve for users? And besides, often these sort of senior manage-y type people, they don’t really know what good user research is anyway. So really, like I was saying earlier is really a lot of the job of the researcher is to teach the people around them how they can be engaged with in a constructive way so they don’t get approached with very tightly defined research questions that are overly scoped basically. So I’ve got a metaphor here. If you think of your problem space as being like a dark cave. Using research is a bit like a flashlight that shines a beam into the cave so you can see what’s going on. The first time, if you did go climbing or go exploring and find a big dark cave. The first thing you’re going to want to do is shine your torch, shine your flashlight around the cave to try to work out what’s in there. You’ll probably do it quite quickly right just to make sure that you’re safe and there’s no big surprises like a bear or something. And then once you’ve done that, then you might have a more focused beam and shine it at something else. You might feel like okay we’ve covered all that, we’ve done our first pass. Now we can focus on that really exciting structure over there, the stalagmites and stalactites or something like that that you really point with being there and get really interested and focused on it. So I guess a bit of a tenuous metaphor there, but I think it’s really, really important to always start broad. Otherwise you can end up getting really deep into something and missing the points on how. 
Because human life is multilayered and it’s always good to start out with the broadest possible way and then zoom in gradually rather than zoom in first and kind of miss out on some big thing that you should be working on.”

Similarly, Josh LaMar, formerly the research manager for Outlook, said,

“I think that the way that you frame a question is very, very important because you have to be at the right level. And what I mean by level is that, if you start off an interview by saying like, well, tell me how you check your email on the weekends, you’ve just scoped it so narrow and really, you might be interested in something else. I was the research manager at Outlook for several years, so I can use email as a really easy example of things that I’ve done research on a lot. So it’s really important to start very broad and then move into the specific. And an example of a broad question might be, tell me how you communicate with your friends and family, much more broad than just email. And then as you start getting into it, you’ll find more interesting things. The framing is so important because when you frame too narrowly, you put this box around the user. And the user thinks, I think that they want to hear just this part, and so they only share the things that are in that box. But when you add a broader box from the beginning, then everything else is open. And you might find something that’s even more interesting just by asking a broader question.”

Brignull’s cave metaphor is exactly how we should think about our research. When writing your discussion guide or survey, start by identifying your assumptions and then set them aside. The broader your initial sweep, the better you’ll understand the participant’s context and their opinions about your research topic. 

In line with starting broad and then narrowing in on your research question, I loved the tactical example of how Emma Craig of Shopify breaks this point down. Craig said,

“I think good interview questions or these direct questions that you’re asking a participant or a respondent start with your bigger question, your research question. And I don’t want it to get confusing here of what’s what. But before you can start to formulate your discussion guide and understand exactly what it is you want to ask these people when you’re face to face with them, you have to have your research question and your research objective in mind. So, the research question here is, essentially, seeking to understand why something is happening. Or what is happening? You’re looking to uncover a process, or a need, or a challenge that someone is experiencing. So, an example would be, “What are the biggest challenges people experience when it comes to taking public transit?” And that would be your research question from which you derive all of your interview questions. And you had a really good point about not asking these pointed, direct questions that you just directly ask because, half the time, people won’t actually know the answer or they won’t have the answer. But I have learned over the years that if you ask somebody a question, they will answer your question. So, whether they make it up, or they exaggerate, or whatever it might be. If you ask them something directly, they’ll give you a direct answer. And you can’t always be certain but that is true or that they are not just telling you what they think you want to hear. So, your interview question, it’s there for you to collect evidence and you have to take different angles. You have to go sideways or, like you said, take the backdoor. If your research question is around the biggest challenges people experience when it comes to taking public transit, your interview question shouldn’t be just asking somebody if they like to take the bus, your interview question could be asking them to walk you through how they got to work last week. And to kind of take these roundabout ways to understand the environment that it is you’re researching.”

The participant’s context when consuming or experiencing the thing you are measuring is vital. Just as a wheel doesn’t matter without a car, we have to put our participants in the context of their consumption and then drill down.

This is much harder than simply reducing your research to a Net Promoter Score or a similar Likert-scale metric, which is easier but far less effective at uncovering hidden truths.
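To illustrate how much nuance a single score hides, here is a minimal sketch in Python of the standard NPS calculation. The ratings are made up purely for illustration, and the variable names are our own, not from any particular survey tool.

# Minimal sketch of the standard Net Promoter Score calculation.
# The ratings below are invented for illustration only.
ratings = [10, 9, 9, 8, 7, 6, 5, 10, 9, 3]

promoters = sum(1 for r in ratings if r >= 9)   # 9-10 counts as a promoter
detractors = sum(1 for r in ratings if r <= 6)  # 0-6 counts as a detractor

nps = (promoters - detractors) / len(ratings) * 100
print(f"NPS: {nps:.0f}")  # a single number, with no trace of why anyone scored the way they did

The score is easy to compute and track over time, but it carries none of the context that the interview techniques above are designed to surface.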

Tip 5: Protect your participants 

Let’s chat about the importance of protecting your participants from your customers and internal stakeholders. Brignull said,

“I remember once doing some research and you have the stakeholders in the room. And one of the stakeholders would rap his fingers on the table like this when the user didn’t answer the question. Yeah, we were doing some research on time tracking companies in Munich. Because the tech was like a stumbling piece of tech that you kind of had to be in the room to see working. So that didn’t go. Basically you need to keep the stakeholders far away sometimes. And I often find that, I know some researchers like to have a chat window open and like to let some people ask some questions during the research. I absolutely will not abide that as the researcher they can all get lost. They can write notes and stuff and I’ll talk to them afterwards. But having that extra channel of input while you’re trying to run an interview, it’s just mind meltingly annoying.”

While Brignull’s example sounds like it came straight out of an episode of The Office, this is a real issue. Luckily, tools like Lookback provide virtual observation rooms where the internal chat is protected from the participant, and the researcher can focus in one window.

Tip 6: Leverage past behaviors to inform future usage

The next tip is around the importance of leveraging past behavior to inform future outcomes. Craig said,

“So, a really simple example would be, “How often do you picture yourself using this?” Maybe in the interview, you have exposed that this is something they are interested in, and they think it would be very helpful. It would ease all of these pains and challenges that they have. And then you want to say, “OK, well, how often do you think you would use it?” But people cannot give you a realistic idea about the future; they don’t know, they will make it up. Like I said, if you ask somebody a question, they will answer that question. But it probably won’t be true because they don’t know well enough if they’ll use something or if they’ll do something in the future. I think an example I use a lot is if somebody asked me what I was going to eat for lunch tomorrow, I can’t actually- I can give them a guess but I can’t actually tell them. But if they asked me what I have eaten for lunch every day this past week, I’ll give them a much better indication of what I might eat for lunch tomorrow or what my lunches look like. So, yeah, probably that, asking people to predict instead of basing the question in past behavior.”

Asking about prior usage and experience, rather than asking participants to predict, gives you a much less biased basis for modeling and projecting future behavior, which is usually what you actually want to measure.

Tip 7: Run a pilot before your study

The last tip that came up is from way back when. Prior to the internet, recruiting people for research was hard and expensive. To avoid a project getting to an irreparable state, we’d often run a pilot project. This took a bit of extra time and roughly 10% of the total budget, but it ensured that we were asking the right questions in the right way to answer our research objectives. 

We hope you found value in this post and, if you did, please take a moment to share it on social. It’ll help other research professionals elevate their craft. 

Happy Researching!

Please find the references for this post here.