If you want the best data from your surveys, you have to know how to write effective questions.
While simple enough in theory, developing relevant survey questions takes some intention and a bit of practice. But that doesn't mean it has to be complicated.
Today, we’ll share some simple yet powerful tips from our in-house research team to help elevate your next survey.
Before we go any further, let's take a step back and investigate what a "good" survey question really is.
By definition, a good survey question is:
Clear and direct in meaning.
Considerate of user experience.
Free from bias.
Focused on achieving the goal of the survey.
If that sounds incredibly simple, you’re not wrong! But it does require some forethought.
By focusing your energy on writing clear, unbiased questions that point back toward your objective, you can set yourself up for high-quality insights.
Here are some guidelines to consider when drafting your next survey:
Before you even begin a rough draft, you need to know the purpose of your project, the objectives you want to achieve, and the topics you'll need to cover.
A great way to do this is by asking yourself (or your team) some basic questions, like: How will this data be used? What metrics matter most to us? And what decisions will be made based on the insights we find?
Nailing down these details early on will ensure your questions are relevant and the feedback you collect is reliable.
Unless your sample is highly targeted at a specific group (think Ph.D. students or IT decision-makers), you need to keep your language simple and free of jargon, undefined acronyms, and buzzwords.
A general best practice is to write questions at your audience's level of understanding. So, if the sample for your survey is the general public, you’d want to write for a 6th-grade level of comprehension. That means keeping your vocabulary and sentence structure simple and straightforward.
We know, we know: this one is much easier said than done. But research has shown that long, monotonous surveys only lead to respondent fatigue and poor-quality data.
Because surveys are highly personalized to the use case they serve, it's impossible to give a precise question limit.
Instead, we like to think about it in terms of time. Try to keep your surveys no more than 7-10 minutes long to avoid respondent fatigue.
And if your survey has to be longer, consider providing a good incentive; incentives can be especially useful in these situations!
If you have any screening questions that may terminate or disqualify a respondent, you’ll want to ask them early in your survey.
For example, let's say you're targeting consumers who have used products from your competitors. You would need to include a question at or near the start of your survey asking which products respondents have tried.
This ensures that no one's time is wasted and helps you fill your survey with qualified respondents more quickly.
It's impossible to list every potential answer to a question.
That's why you should always provide alternative answer options for multiple-choice questions, like "None of the above" and "Other (please specify)."
Including these additional options gives respondents an out in situations where they would otherwise be compelled to select a choice that didn’t reflect their feelings.
Multiple-choice questions aren’t the only (or best) way to collect feedback!
Adding rank order, matrix, rating scale, and even heatmap questions can make your survey more engaging and provide interesting insights. A win-win for everyone.
Plus, rating scale and slider questions work seamlessly with SightX’s automated persona tool, which uses machine learning to uncover buyer personas you might be missing. If you’re interested in this topic, check out our blog Consumer Segmentation: Maybe You Could Be Doing it Better.
Text analysis can give you some honest and intriguing insights. However, open-response questions come with drawbacks.
Specifically, the toll they take on respondents.
Not only do people get tired of typing out long answers, but this question type can often carry a heavier mental load than simply selecting, rating, or ranking answer options.
So, use open-ended questions strategically and only when needed.
We’ve said it before, and we’ll probably say it again: do NOT use even-numbered scales. Instead, stick to a 3-, 5-, or 7-point scale.
Why? Because even-numbered scales do not leave room for a neutral option. And without a neutral option, truly neutral respondents can be forced to select a rating that doesn't reflect their feelings.
You also need to make sure your scales are consistent.
If the first scale question you ask begins at “Very Negative” and ends with “Very Positive,” you’ll want to keep this structure throughout your entire survey.
Sometimes people’s attention can stray from the task at hand. Unfortunately, that isn’t ideal if you're looking for high-quality survey data.
To combat this, add a simple attention-check question in the middle of your survey, for example, one that instructs respondents to select a specific answer option.
Short surveys will only need one. But if your survey is on the longer side, consider adding more than one.
This one might seem simple, but we’ve included it for a reason.
Before you send your survey off to thousands of people, give it a try yourself. You may catch spelling, grammar, or survey flow issues you would have otherwise missed. So never overlook the power of a good test.
Now, for the other side of the coin.
Leading questions steer respondents in a specific direction with an implication that there is a correct answer.
For example, if you wanted to get some metrics on your customer satisfaction, you might include a question like “How satisfied are you with our product?”
However, this phrasing assumes the customer was satisfied.
Instead, you might rephrase the question to: “Please rate your experience below, with 1 being ‘not at all satisfied’ and 5 being ‘very satisfied.’”
When writing your survey questions, the word “and” isn’t your friend.
For the most reliable data, you want to only ask about one metric at a time.
For example, if you were to ask a respondent “How would you rate the performance and value of our product?” you’d be putting them in a tricky situation. Maybe they found the performance stellar but felt differently about the value.
Instead, you would want to separate the performance and value metrics into two questions to properly evaluate each.
If you’re using any demographic questions as screeners, you’ll need to include those upfront. The rest, however, you can split between the beginning and end of your survey. That way, you can still collect the data you need without requesting too much personal information upfront.
The SightX platform is the next generation of market research tools: a single, unified solution for consumer engagement, understanding, advanced analysis, and reporting. It allows insights, marketing, and CX teams to start, optimize, and scale their insights workflow.
But, SightX isn’t just great tech. Our Research Services team knows all of the best practices, along with some tips and tricks for getting the best data out of your surveys.
Remove the guesswork from your current strategy by going directly to the source. Start a free trial today!