Survey Writing Assistance

How to create a survey

When you have a question about your business—which new product to offer, what new features and services to add, where to open your next location—there's only one obvious way to find out: a survey. It's like magic: you put together questions, ask your followers and fans, and voilà, you know what to do next.
Surveys seem so simple, but they rarely are in real life. Slightly vary the types of questions and response options in your survey, and you can seriously impact the quality and value of your survey’s results.
Bad results can lead to bad decisions—the very thing you set out to avoid by making a survey in the first place. Ask the wrong questions, or ask them in the wrong way, and you'll end up with products and services no one wants.
That's why thoughtful survey design is so important. It'll help you get better, trustworthy results. Here are some tips on how to build an effective survey, starting with the one thing that's most important in a survey: questions.

Surveys: Where to Begin

It's easy to begin the survey writing process by brainstorming a list of questions to ask. Your head's full of questions you're dying to ask your customers, and it'd be so easy to type them out in a survey app and call it a day.
But that's far from the best way to start. Instead, you should begin your survey building process by brainstorming the answers you want. You want actionable feedback, and you'll be most likely to get that by thinking through the exact answer you want. This could be a simple answer (perhaps "Our customers want us to offer THIS flavor of soda") or a more complicated hypothesis you want to prove (such as "Concern about social status is/is not correlated with social media usage.").
So sit down, and think through what you want to learn from your survey. Write down each answer you want, with a blank in the spot of the thing you want to learn—the flavor of soda to offer, the feature people are missing, or the correctness of a statement. Once you've completed this exercise, use the list to build questions for your survey.
Starting with a list of answers and turning them into survey questions will ensure you include all of the questions you need, and word them in a way that will get effective answers. It will also prevent you from inflating your survey with questions that don’t matter.
Just like you start a building project with blueprints—and don't just begin pouring concrete whenever you decide you want a new building—your survey should start with the answers you need. Then you'll be better prepared to write the questions that will provide those answers.

Getting Answers: Survey Question Types

As you're turning your answers into questions, you'll need to think about what type of questions you need to ask. Surveys aren't just about yes and no questions—you'll find dozens of question types in most survey apps. It can quickly get confusing to know which type of question to use for each answer you need.
The type of question you use will affect the answers you get and the kinds of analysis you can do. Here are the most common types of questions you can use in a survey, along with examples of the type of data you'll collect with each.

Categorical Questions

If you're looking for a simple count, like "35% of people said ABC" or "20% of men and 24% of women…", then there are several question types you can use: Yes/No, checkbox, or multiple choice. These types of questions are also called "nominal" questions.
Analysis of categorical-level questions can include counts and percentages—"22 respondents" or "18% of customers", for example—and they work great for bar graphs and pie charts. You cannot take averages or test correlations with nominal-level data.
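Most survey apps will tally these numbers for you, but if you export the raw responses, a few lines of code can produce the same counts and percentages. Here's a minimal sketch in Python, assuming the answers to one nominal question have been exported as a plain list of strings (the data below is invented for illustration):

from collections import Counter

# Hypothetical exported answers to a Yes/No question ("Are you a vegetarian?").
responses = ["Yes", "No", "No", "Yes", "No", "No", "No", "Yes"]

counts = Counter(responses)
total = len(responses)

for answer, count in counts.most_common():
    print(f"{answer}: {count} respondents ({count / total:.0%})")

The same approach works for checkbox and multiple choice answers; only the list of possible answer choices changes.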

YES/NO

The simplest survey question—and the only question you'll usually use in a poll—is a Yes/No question. You'll ask a question, then have two options: Yes and No. Your survey app likely offers a Yes/No question; otherwise, use the multiple choice question and add Yes and No answers yourself.
Example: Are you a vegetarian? Yes/No

MULTIPLE CHOICE

Need more nuance than a Yes/No answer gives? Multiple choice is what you need. You can add as many answers as you want, and your respondents can pick only one answer to the question.
Example: What's your favorite food? Pizza/Pasta/Salad/Steak/Soup/Other

Ordinal Questions

When question responses have a clear order (like "Income of $0-$25K, $26K-$40K, $40K+"), we call them "ordinal" questions. You could collect ordinal data with Multiple Choice questions, or you could use drop-down or ranking questions.
Analysis for ordinal questions is similar to analysis for nominal questions: you can get counts and percentages. You cannot find averages or test correlations with ordinal-level data.

Interval/Ratio Questions

For the most precise data and thorough analysis, use the interval or ratio question type. These questions allow you to conduct advanced analysis, like finding averages, testing correlations, and running regression models. You'll use ranking scale, matrix, or text field questions in your survey app to ask these types of questions.
Interval questions are often asked on a scale of 1-5 or 1-7, like from "Strongly Disagree" to "Strongly Agree" or from "Never" to "Always." Ratio questions have a true zero and often ask people to enter an actual number into the survey field (like "How many cups of coffee do you drink per day? ____"). You don't really have to worry about the differences between the two types.

RANKING SCALE

The default choice for interval questions, ranking scale questions look like a multiple choice question with the answers arranged in a horizontal line instead of a list. There will likely be 3 to 10 answers, either with a number scale, a like/love scale, a never/always scale, or any other interval scale. It's a great way to get a more precise measure of people's thoughts than a Yes/No question could give.
Example: On a scale of 1-5, how would you rate our store cleanliness? 1/2/3/4/5

TEXTBOX

For ratio questions—or direct feedback, or personal data like names—you'll need the textbox question. There's usually a small and large textbox option, so choose the size that's appropriate for the data you're collecting. You'll add the question, and then there will be a blank where your respondent can enter their answer on their own.
Example: How many apps are installed on your phone? Enter a number: _____

6 Best Practices for Writing Survey Questions

Now that you have a list of the answers you're looking for, it's time to start writing questions for your survey. You know you want to pick a new flavor of soda to offer, so you immediately start typing:
"We know you love ABC Soda, and so we want to make a new drink you'll love. We've thought about offering Watermelon or Potato Chip flavored soda, but wanted to ask you first. So, which flavor of soda would you like to see us offer, and what size of bottles would you like to buy it in?"
Nope. That's not going to work. Before you write pages full of detailed questions, you'll need to remember to follow these tips to build effective survey questions:

1. Use Simple, Direct Language

Avoid using big words, complicated words, and words that could have multiple meanings. Your question should be short, simple, and clear.

2. Be Specific

Some concepts may mean different things to different people. Try to be as specific as possible when you ask questions. For example, instead of asking “Do you exercise regularly?” you could ask “How many days per week, on average, do you exercise?” This gives you a more precise, objective answer.

3. Break Down Big Ideas into Multiple Questions

Another way to deal with broad concepts that mean different things to different people is by breaking them down into multiple, more tangible questions.
“Customer satisfaction” is a common topic that businesses want to explore, and it's a big question packed with smaller ideas. Instead of asking “How Satisfied Are You with This Product?”, you could instead ask people to give their opinion on three separate statements (asking them to weigh in on a scale of “Strongly Disagree” to “Strongly Agree”):
I enjoy using this product.
This product meets my needs.
I would purchase from this company again.
Break down big concepts into separate questions.
The individual statements provide insight into different pieces of your business, and the average of the scores gives you a general measure of satisfaction that you can track over time and try to improve. Together, the three questions give you a precise, actionable answer to the question of customer satisfaction.
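As a rough sketch of how that composite score works, here's one way to average the three statements into a single satisfaction number per respondent, assuming each answer has already been coded 1 (Strongly Disagree) through 5 (Strongly Agree); the respondent data below is invented for illustration:

from statistics import mean

# Hypothetical coded answers to the three statements, one list per respondent.
respondents = {
    "respondent_1": [4, 5, 4],
    "respondent_2": [2, 3, 2],
    "respondent_3": [5, 4, 5],
}

# Composite satisfaction score: the average of the three statement scores.
satisfaction = {name: mean(scores) for name, scores in respondents.items()}
overall = mean(satisfaction.values())

print(satisfaction)
print(f"Overall satisfaction: {overall:.2f} out of 5")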
Marketing scholars have developed many series of questions (called "scales") that together measure broad concepts (known as "constructs" in the research field) like customer satisfaction. Published collections of these marketing scales are a must-have for any serious market researcher, with validated questions for constructs like customer satisfaction, brand affinity, and more. Or, for a simpler option, many survey apps, such as SurveyMonkey, offer pre-made question sets. Rather than try to come up with your own questions, you can use questions that have already been shown to be statistically valid.

4. Avoid Leading Questions

Sometimes, researchers’ opinions can seep into survey questions, subtly encouraging respondents to answer in a certain way and compromising survey results.
For example, asking “Do you think the school should cut the gym budget to pay crossing guards?” would likely prompt a different answer than asking, “Should the school employ crossing guards to protect our children?” even though both questions are related to the same topic.
To avoid leading questions, ask a friend or colleague to review your survey for any questions that seem like they have a right or wrong answer. If your friend can guess what kind of answer you’re looking for, consider rewriting the question. The answer may even be in splitting the question into multiple questions—a great option for the example question.

5. Ask One Thing per Question

Each of your survey questions should ask one thing, and one thing only. It seems simple enough, but many survey writers fall into the “double-barreled” question trap. For example, “Do you eat fruits & veggies on a daily basis?” can actually be a hard question to answer. What if somebody eats just fruits or just veggies? There’s not a clear way for them to answer this question. A better option is to split the question into two separate ones.
You can check your survey for double-barreled questions by looking for words like “and” or “or” in your questions.

6. Use More Interval Questions

One simple way to take your survey from good to great is by changing your Yes/No and multiple choice questions to interval questions. Make a statement, and ask people to answer it on a 1-5 or 1-7 scale, like “Strongly Disagree, Disagree, Neither Agree nor Disagree, Agree, or Strongly Agree.” You’ll instantly upgrade the level of analysis you can perform on your results.
Researchers use scales of 1-5 or 1-7 because they do a good job capturing variation in answers without causing information overload for the respondent. It may seem like using a scale of 1-100 would help you capture really detailed answers, but it actually causes respondents to answer 0, 50, or 100—their answers tend to gravitate toward the extremes or the center. Using a scale of 1-5 or 1-7 will help you get more accurate, nuanced answers from respondents.
Then, instead of looking at each question individually, like most people do, you can add another layer of analysis by looking at how questions relate to one another. When you ask interval questions, you open the door to checking correlations, which allow you to say "People who are more likely to ABC are less likely to think DEF." If you're a statistics whiz, you can run a regression and say "Factors G, H, and I have the biggest impact on J." More simply, you can take averages and say things like "Junior employees, on average, exercise more often than senior employees."
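To make that concrete, here's a minimal sketch of the kind of correlation check described above, using Python's standard library. The two questions and the coded answers are hypothetical, and statistics.correlation requires Python 3.10 or newer:

from statistics import correlation, mean

# Hypothetical coded 1-5 answers from the same seven respondents to two interval questions.
cleanliness_rating = [5, 4, 2, 5, 3, 1, 4]   # "How would you rate our store cleanliness?"
likely_to_return = [4, 4, 1, 5, 3, 2, 3]     # "How likely are you to shop with us again?"

r = correlation(cleanliness_rating, likely_to_return)   # Pearson's r
print(f"Average cleanliness rating: {mean(cleanliness_rating):.2f}")
print(f"Correlation between the two questions: r = {r:.2f}")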

Survey Pitfalls to Avoid

Once you’ve written your survey questions and responses, it's time to make sure you haven’t fallen victim to the following pitfalls.

Bias

Survey response bias is a sad but important reality to consider when writing surveys. Asking for information like gender, race, or income at the beginning of a survey can influence how people respond to the rest of the survey. (This is also called stereotype threat.) For example, studies have shown that when girls identify their gender before a math test, they perform worse on the test than girls who aren't asked to identify their gender.
Most survey writers prevent bias and stereotype threat by asking sensitive questions—including those about gender, race, and income—at the end of surveys.
Bias can happen on a smaller scale, too. Asking someone “How important do you think content marketing is?” followed by “How much do you plan to invest in content marketing next year?” could lead to bias. If someone says they believe content marketing is very important, they may inflate the dollar amount they plan to spend in the next question. Randomizing question order is a simple way to prevent this type of bias.
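Most survey apps have a built-in setting for randomizing question order, but if you assemble your survey programmatically, the idea is as simple as shuffling the question list before presenting it to each respondent. A minimal sketch (the third question is a placeholder):

import random

questions = [
    "How important do you think content marketing is?",
    "How much do you plan to invest in content marketing next year?",
    "How satisfied are you with your current marketing tools?",
]

# Draw a fresh shuffled copy for each respondent so everyone sees a different order.
order_for_respondent = random.sample(questions, k=len(questions))
for number, question in enumerate(order_for_respondent, start=1):
    print(f"{number}. {question}")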
Bias can also happen when you interpret the survey. Without even knowing it, you might treat one person's opinion differently simply because of their demographic answers. In some cases, you might not want to gather any demographic data at all to create a totally anonymous survey, something common in academic research.

Framing

The wording used in survey instructions about why a survey is being conducted can impact the way respondents answer questions. For example, framing a customer service follow-up survey as an evaluation of a team member may prompt respondents to be more positive than if you framed the survey as a tool to improve your processes.
Asking “Did John solve your problem well?” may encourage more positive answers since respondents are being asked about a specific, real person. But you want high quality answers, so asking “Did we solve your problem today?” is a more neutral way to phrase the same question.
People have a tendency to want to help. If you tell them that the survey has a goal, they may answer questions in a way that helps you achieve that goal, instead of answering the questions totally honestly. To prevent this, try to be neutral when you describe the survey and give instructions.

Incomplete Options

When questions are asked on a scale (like from “Strongly Disagree” to “Strongly Agree”), respondents can get frustrated if there’s not a Neutral option. Neutral options are usually handled two ways: giving respondents a “Neither Agree nor Disagree” option in the middle of the scale and giving them a “N/A” option at the end of it, in case the question does not apply.
Respondents can also get frustrated with your survey when it forces them to answer questions in ways that aren’t true. For example, if you ask “What’s your favorite food?” and only provide pizza, cheeseburgers, and burritos as options, people who love chicken nuggets don’t have a clear answer choice. An easy way to get around this is to provide an “Other” option and make the question not required.
Always give users an 'Other' option.
An even better solution is to make sure you’re providing robust answer choices by pre-testing your survey and allowing respondents to brainstorm answer choices you may have missed. (You could also rewrite the question to not require as precise of an answer.)

Tips on Survey Format

Keep your survey as short as you can by limiting the number of questions you ask. Long surveys can lead to "survey fatigue." When survey fatigue hits, respondents either quit the survey or they stop paying attention and randomly check boxes until it's complete. Either way, your data gets compromised.
Putting together your ideal “list of answers” before you write your survey will help you make sure to only include the questions that need to be asked. Compare the questions you’ve written to that list of answers. If there are any unnecessary or extra questions, remove them from the survey.
Here are a few more tips for formatting your survey to avoid survey fatigue and get meaningful results:

Break the Survey into Multiple Pages

If your survey does get long, consider breaking it into multiple pages. Respondents will be less overwhelmed when they look at it. Be careful, though, because having too many pages can also cause survey fatigue. You’ll have to strike a balance between page length and number of pages.

Conclusion

Once your survey is written, don't share it with the world just yet. Instead, it's time to pre-test it. Pre-testing will help identify unclear questions, badly worded responses, and more before you send your survey out to your respondents, and will give you a chance to improve your survey and its chances of generating actionable feedback.
To pre-test, send your completed survey to a few different people and ask them to tell you about any questions that seemed unclear or any problems they found. If you can, sit down with at least one or two people while they take the survey and listen to their reactions and feedback as they go. You’ll often hear things like “I’m not sure how to answer this” or “Wow, this is really long,” which are clues that you have some revisions to do.
After it’s been pre-tested, take the time to review the results of your survey from your testers. This data isn't meaningful for your survey, but it can be helpful to make sure everything's working correctly. Make sure nothing has gone wonky with measurement and that you’re getting the types of data you expected.
No matter how much thought and hard work you put into creating your survey, it’s going to have flaws—and that’s okay. No survey is perfect, but investing time and thought into planning and writing will bring you much closer to getting the answers you need. And that's the ultimate goal—the survey's only a tool to get you there.

Components of an effective survey question

Generating clear and straightforward questions and formatting the survey questionnaire in a way that flows and promotes participation are key.
The following section summarizes a question-and-answer process described by de Leeuw and colleagues, meant to provide tips and strategies for developing effective survey questions.

IN ORDER TO ANSWER A QUESTION, A RESPONDENT MUST:

Understand the question in the way the researcher intended it to be understood;
Have (or be able to retrieve) the relevant information needed to answer the question;
Translate that information into the appropriate format required to answer the question; and
Be willing to share the relevant information in the most accurate answer possible (for tips on reducing bias, refer to de Leeuw et al.).

1. Write questions respondents can understand

Given that survey methodology consists of asking a series of standardized questions, respondents need to have the same understanding of what a question is asking. To reduce misunderstandings in questions, work to reduce ambiguity.

1.1 Use clear and simple language, avoid complex terms, acronyms, and jargon

The use of complex language and terms, as well as acronyms and jargon specific to your field, will make questions more difficult to answer.
Example:
Poor question: “In the last week, did you exercise?”
The problem: The term ‘exercise’ isn’t clearly defined and could be interpreted differently by respondents — some might consider walking to and from their car exercise, whereas others may not consider anything that doesn’t cause them to sweat or breathe heavily as exercise.
Better question: “In the last week, did you exercise or participate in any physical activity for at least 20 minutes that made you sweat and breathe hard, such as running, swimming, cycling, or similar activities?”
Another Example:
Poor question: “Have you ever been diagnosed with CHD?”
The problem: Using acronyms and technical jargon will make it difficult for people who don’t know these terms to provide reliable answers.
Better question: “Have you ever been diagnosed with coronary heart disease (CHD) — a disease in which the blood vessels that supply oxygen and blood to the heart narrow?”

1.2 Provide a time frame or reference period

Ensure respondents know the period or time frame — and therefore the context — of your questions.
Example:
Poor question: “How often are you sad?”
The problem: A lack of time frame or context in which to answer can lead respondents to interpret the question differently (i.e. answer generally or pick a random reference period).
Better question: “In the past four weeks, how often have you felt sad?” or “On average, how often do you feel sad?”
Note that this type of question would be best answered using a Likert rating scale (i.e. where 1=never and 5=all the time).

1.3 Avoid questions with an embedded assumption

Sometimes questions can contain assumptions about the respondents’ situations or how they think about things. When these assumptions are not true, the respondents may be forced to answer in a way that does not accurately represent their situation.
Example:
Poor question: “When riding in the back seat of a car, how often do you wear a seat belt?”
The problem: This assumes the respondent rides in the back seat of a car, and does not provide an option for those who never ride in the back seat (it also doesn’t provide a time frame). To answer this question, you need to first determine if the respondent has ridden in the back seat, and if so, how often they wear a seat belt.
Better question(s): 1.a) “In the past year, have you ridden in the back seat of a car?” (IF YES) → 1.b) “When riding in the back seat of a car, how often do you wear a seat belt?”

1.4 Avoid double-barreled or multiple questions

Double-barreled questions ask two things at once. To avoid ambiguity, ensure there is only one question being asked of respondents at a time.
Example:
Poor question: “How helpful were your friends and family while you were sick?”
The problem: This question asks about the helpfulness of two separate groups of people (1. friends and 2. family), so respondents will have to either ignore part of the question or lump the two groups together and answer about both at once.
Better question(s): 1. “How helpful were your friends while you were sick?” and 2. “How helpful was your family while you were sick?”

1.5 Avoid double-negative questions

Double negatives pose the risk of misinterpretation, and thus present the risk of respondents answering questions in a way that is unintended (or even the opposite of what the question is actually asking).
Example:
Poor question: “Which of the following behaviours is not considered unacceptable?”
The problem: Double negatives can be confusing, and may result in unreliable responses — be as clear and unambiguous as possible.
Better question: “Which of the following behaviours is considered acceptable?”

2. Write questions respondents have the information they need in order to answer

A respondent can only answer questions for which they already have or can retrieve the information. There are two main reasons respondents may not be able to retrieve the relevant information:

2.1 Lack of Information

A researcher may unintentionally ask questions the respondent doesn’t necessarily have the information to answer, or in a format that isn’t familiar to the respondent. For instance, a question might ask “How many miles from your home is the nearest hospital?”, though this question may be answered more easily with travel time as the unit of measure, or by asking about the location of the respondents’ home and the nearest hospital, and having the researcher calculate the distance.
Additionally, people cannot accurately and reliably report on others’ emotions, so it’s best to avoid asking such questions. For example, instead of asking “How much does your mother enjoy the activities in the nursing home?”, a better alternative would be to ask about an observable behaviour, such as, “Does your mother participate in any activities in the nursing home?”

2.2 Recall problems

A respondent may not be able to recall the information needed to answer certain questions, for various reasons. There's no way to eliminate recall problems entirely, but a few strategies can help respondents retrieve the information:
Make the time frame consistent with the significance of the event — the more minor the event, the shorter the time frame should be.
Deconstruct a large complex question into a series of smaller questions which will allow the respondent to spend more time on each element of the question.
Consider using retrieval cues, such as asking the respondent to think of actions that are associated with the event. (e.g. taking prescription medications, missing work, and staying in bed can act as recall cues for visits to the doctor).

3. Write questions so that respondents can translate their answers into the required format

3.1 Be clear about the desired answer format

Respondents should be able to easily tell what kind of answer is required, what level of detail is expected, and how they should categorize open-ended answers.
Example:
Poor question: “How long ago did you leave your job?”
The problem: The format of the time period isn’t specified, whether it should be reported in days, months, or years.
Better question: “How many months ago did you leave your job?”

3.2 Ensure response options are appropriate and obvious

Don’t make respondents work to figure out how to answer your question — the response options should be clear from the way the question is written.
Example:
Poor question: “In the past 12 months, did your doctors treat you with respect (YES/NO)?”
The problem: This question is formatted as a binary question when the reality can be variable and nuanced.
Better question: “In the past 12 months, how often did your doctors treat you with respect?”

3.3 Provide mutually exclusive and exhaustive options for closed-ended questions

In other words, there should be a response option for everyone, and everyone should have only one response option that best fits their situation.
Poor question: “Are you currently: married, separated, divorced, widowed, living with a partner, or have you never been married?”
The problem: There are multiple ways this question can be answered by someone who has never been married and lives with a partner, or is separated and lives with a partner, etc.
Better question(s): 1. “What is your current marital status (married, separated, divorced, widowed, or never been married)?”, and 2. “Are you currently living with a partner/your spouse?”

4. Write questions respondents are willing to answer honestly and accurately

There are many reasons why a respondent may not want to answer certain questions honestly. Concerns about anonymity and confidentiality (i.e. being worried the information they provide will be shared with others), personal image (i.e. wanting to present a good image to others), and/or a desire to be properly classified (i.e. if the correct answer is seen as potentially leading to an incorrect conclusion, respondents might feel they need to distort their answer in order to receive the right classification) can all lead respondents to purposefully or inadvertently distort their answers.
One method to mitigate these issues is to provide sufficient information and reassurance about confidentiality and anonymity through informed consent. Within the questionnaire itself, de Leeuw et al. offer a few suggestions on how to provide context within your questions in order to ease concerns about how responses will be interpreted and classified.

4.1 Include an introduction to the question

For example, “Many people find they do not exercise as much as they want to because of other work, family, or other responsibilities. Would you say you exercise as much as you would like?”

4.2 Think carefully about how your questions are organized

For instance, including alcohol in a list of other substances that can be abused (such as marijuana, cocaine, and prescription drugs) will provoke a different reaction than including alcohol in a list of healthy behaviours (such as exercise, having regular check-ups, and brushing your teeth).

4.3 Provide inclusive response alternatives

For example, when asking a question about hours spent watching television per day, make the maximum response option much larger than would be expected (such as 10 hours instead of 4), so that participants don't distort their answers to avoid being seen as extreme.
Although following these tips will help improve questionnaire design and clarity, pre-testing your questions is also a key step in creating an effective survey (for more information on testing your questions, see de Leeuw et al.).
For additional information on effective question-writing, refer to the work of Dillman, Smyth & Christian or other free online resources.
In our next post, we’ll discuss how the organization and layout of questionnaires can be improved to make your survey study even more effective.

Further reading: Pros & Cons of Survey Studies

Advantages of a Survey:

Provides a description of participant characteristics
Is economical in collecting data from a large number of participants over a short time period
Allows responses to be easily recorded and aggregated for analysis
Facilitates data collection from multiple, geographically-distributed locations.

Disadvantages of a Survey:

Is based on self-reporting, and relies on participants’ ability and willingness to answer questions honestly and completely
Presents concerns about reliability (i.e. how dependable, stable, and consistent the questionnaire is when repeated under the same conditions, such as a questionnaire given to the same people at two different time points)
Raises concerns about validity (i.e. how much the questionnaire measures what it is intended to measure, such as barriers and facilitators)
