6 Types of Survey Biases and How To Avoid Them
This post will highlight six types of survey biases to be cautious of when creating a market research questionnaire or when conducting any form of survey research.
Table of Contents:
- Sampling bias
- Non-response bias
- Acquiescence bias
- Social desirability bias
- Question order bias
- Interviewer bias
Sampling bias
Sampling bias, also known as selection bias, occurs when a researcher gathers a sample of respondents that does not accurately represent the population. As a result, the survey results cannot accurately be used to make claims about a general consumer base. This form of bias can stem from limited consumer accessibility to a survey, failure to account for a specialized/niche topic, or the location of those sampled.
How to avoid sampling bias
Fortunately, there are simple ways to avoid sampling bias/selection bias in your research efforts. Here are a few tips to consider:
Ensure your sample is randomized
This means all qualifying participants have an equal chance of being selected. To do this, use a random number generator or another form of random selection. Many panel providers use randomized sample selection when sourcing respondents to guard against selection bias.
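As a minimal sketch of what random selection looks like in practice, the snippet below draws an equal-chance sample from a pool of qualified participants using Python's standard library. The pool of respondent IDs is purely hypothetical for illustration:

```python
import random

# Hypothetical pool of qualified participant IDs (illustrative only)
qualified_pool = [f"respondent_{i}" for i in range(1000)]

# random.sample draws without replacement and gives every member of the
# pool an equal chance of selection - the core requirement of a random sample
sample = random.sample(qualified_pool, k=100)

print(len(sample))                      # 100 respondents drawn
print(len(set(sample)) == len(sample))  # True: no duplicates
```

In a real study the pool would come from your panel provider, but the principle is the same: selection should never depend on who happens to be easiest to reach.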
Set quotas for your target sample
This ensures you are not sourcing from one specific demographic or group (e.g. all females, all millennials). Enforcing survey quotas means your survey respondents come from many different backgrounds, with varying perspectives.
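Quota enforcement can be thought of as a simple admission check: a respondent is accepted only while their demographic group still has open slots. Here is a rough sketch, with hypothetical quota figures chosen for illustration:

```python
from collections import Counter

def accept_respondent(demographic, counts, quotas):
    """Admit a respondent only if their demographic quota isn't full yet."""
    if counts[demographic] >= quotas.get(demographic, 0):
        return False  # quota already full (or group not targeted)
    counts[demographic] += 1
    return True

# Hypothetical quotas: a balanced gender split for a 100-person sample
quotas = {"female": 50, "male": 50}
counts = Counter()

accept_respondent("female", counts, quotas)  # True: 1 of 50 slots filled
```

Survey platforms implement this behind the scenes, but the logic is the same: once a group's quota is filled, further respondents from that group are screened out so no single demographic dominates the sample.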
Keep online surveys short and accessible to all
For example, ensure that surveys are mobile-friendly for survey respondents who may not have access to a computer, and are short enough so that even those with busy schedules or other jobs have the option to participate in your questionnaire.
Non-response bias
When you send a survey out to a large group of people, you expect that some won't respond. But when those non-responders represent a large portion of opinions that vary drastically from the ones you have captured, that's what's known as non-response bias.
A common example of non-response bias occurs during elections. Those who can't make it to the polls due to polling location availability, work schedules, or childcare may hold opinions that differ meaningfully from those who do show up, skewing the results. In the online survey world, non-response bias might appear with lengthy surveys and incentives. Say, for example, younger generations are open to taking longer surveys for the incentives and perks; when they realize a survey doesn't offer one, they might drop out. Meanwhile, older generations may take surveys simply because they enjoy them, without relying on an incentive. In this case, you miss out on valuable insights from younger generations whose perceptions could vary drastically from older respondents'.
Non-response bias can also be seen with sensitive information. If respondents feel they are being asked questions that are too personal, without the option to skip, they may just drop out of the survey altogether.
How to avoid non-response bias
Here are a few ways to avoid non-response bias in your market research surveys:
Target the right audience
If you're looking to capture a specific audience of survey takers, make sure those potential respondents are the ones receiving your survey. For example, college students are probably not going to click on a survey about work-life balance, just as homebodies are not as likely to click on a survey about traveling.
Panel companies are useful for this purpose, as they often have a set of pre-identified targets for commonly-sought survey audiences.
Keep it short
Lengthy surveys have a tendency to lead respondents to quit the survey before they’re finished. If your survey needs to be longer in length, consider adding a progress bar to the top of the window so survey takers at least know where they are in the survey. Another option is to add section breaks throughout the survey, letting the respondent know how many sections they have left.
Make sensitive questions optional
When it comes to sensitive questions like political affiliation or income, always give respondents the option to skip the question rather than leading them to quit the entire survey.
Ensure survey delivery
Before sending your survey out to participants, send a test link to yourself or a colleague. Test it on multiple devices and browsers to emulate what a participant is going to be seeing on their end.
Once you confirm the links are working properly, keep track of the surveys you are sending out and the response rate you’re getting back. Online companies can often do this with download data or click-rate data while mail-in survey companies can monitor returned surveys that don’t reach their intended destination. Online surveys will also benefit from being mobile-friendly, to ensure those who are always on the go can easily respond.
In addition to monitoring the delivery rate of your survey, you can also send follow-up reminders. If you send your survey out and someone happens to be traveling, they may forget about it even though they are interested in responding to the questionnaire. A reminder will move the survey back up in their inbox and give them another option to participate - thus improving your response rate.
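Monitoring delivery comes down to one simple ratio: completes divided by successfully delivered invitations. The helper below is a hypothetical illustration of that arithmetic, with bounced (undelivered) surveys excluded from the denominator:

```python
def response_rate(sent, completed, bounced=0):
    """Share of successfully delivered surveys that were completed."""
    delivered = sent - bounced
    if delivered <= 0:
        return 0.0  # nothing reached respondents
    return completed / delivered

# e.g. 500 invitations sent, 12 bounced, 180 completes
rate = response_rate(sent=500, completed=180, bounced=12)
print(f"{rate:.1%}")  # prints 36.9%
```

Tracking this number over time (and after each follow-up reminder) shows whether your reminders are actually pulling in the non-responders whose opinions you risk missing.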
Acquiescence bias
Acquiescence bias, also known as agreement bias, is the tendency of respondents to lean toward positive responses more frequently than negative ones.
For example, acquiescence bias might appear when a respondent feels indifferent toward a topic but they select ‘strongly agree’ because they may feel that’s the ‘right’ answer - even though it doesn’t actually reflect their sentiment.
Reasons for acquiescence bias are vast: some respondents carry their 'always aim to please' mantra with them into surveys, while others may try to 'game the system', thinking that responding favorably will keep them qualified for the survey and avoid getting kicked out. A question's phrasing can also affect whether a respondent exhibits acquiescence bias.
How to avoid acquiescence bias
The following tips can help reduce the chances of acquiescence bias from your survey takers:
Avoid leading questions
When you start a question with "How much do you agree...", you could be priming a respondent to feel they need to take a polar position, when in fact they may feel neutral about the topic. This can result in 'extreme responding' (answers clustered at the extreme ends of a Likert scale).
As an alternative, consider phrasing questions such as: “How do you feel about the following statement: ...”
Keep it anonymous
When respondents know their responses will not be tied to their names or any part of their identity, they may be more inclined to answer honestly rather than favorably. Using a disclaimer is especially useful in work settings, where you want honest and accurate feedback from your employees.
Include red herrings
While this won't avoid acquiescence bias, it is a way to identify when this sort of bias has taken place within your survey data. For example, say you have a matrix question with a Likert scale across the top and statements down the side; if a survey taker selects 'agree' for both of the following statements, they are likely exhibiting acquiescence bias and you can flag them in your data set:
“I love to drink coffee”
“I hate to drink coffee”
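The red-herring check above is easy to automate when cleaning your data. This hypothetical sketch flags any respondent who agrees with both of a pair of contradictory statements:

```python
# Responses counted as agreement on a standard Likert scale
AGREE = {"agree", "strongly agree"}

def flag_acquiescence(response_a, response_b):
    """Flag a respondent who agrees with two contradictory statements."""
    return response_a.lower() in AGREE and response_b.lower() in AGREE

# "I love to drink coffee" vs. "I hate to drink coffee"
flag_acquiescence("Strongly agree", "Agree")     # True  -> flag this respondent
flag_acquiescence("Strongly agree", "Disagree")  # False -> responses are consistent
```

Flagged respondents can then be reviewed or excluded before analysis, keeping straight-lining agreement from inflating your positive scores.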
Social desirability bias
Social desirability bias is a form of survey bias in which respondents answer questions in ways they think will be viewed favorably by others. It is similar to acquiescence bias in that respondents report metrics that don’t necessarily reflect their true sentiments, but for a different reason. While acquiescence bias is typically limited to agreement biases, social desirability bias is a bit broader - not limited to agree/disagree or yes/no.
For example, survey respondents may underreport their alcohol intake or smoking frequency because society views high volumes of these activities negatively. Or, survey takers may give inaccurate answers about how frequently they work out at the gym. When social desirability bias occurs, it can lead to a problematic skew in your survey data.
How to avoid social desirability bias
To avoid social desirability bias in your data set, follow these tips:
Consider survey response options
Rather than having a respondent pick from a defined scale, have them type in a value or a phrase themselves. For example, have respondents type the number of days they work out per week into an open-ended numeric field, rather than choosing from a list of options. This open-ended formatting can lead to more accurate data.
Keep it anonymous
Like acquiescence bias, respondents are more likely to respond authentically if they know their identities are not tied in any way to their survey responses. Provide respondents with a disclaimer at the beginning of your survey that their data will only be measured in aggregate, never on a personal level.
Ask neutral questions
If a respondent gets the idea that one end of a scaled question would be considered more favorable, they may lean toward that one when answering. For example, instead of asking ‘How much do you like cats?’, asking ‘Which animals do you like best?’ is a more neutral way to understand how people feel about cats.
Also be sure to avoid extreme wording in your questions that can be considered positive or negative, such as ‘How much time do you waste on your phone?’. The word ‘waste’ has a negative connotation in this type of question, which could easily sway how a respondent reacts. Instead, pose the question such as: ‘How much time do you spend on your phone?’.
Question order bias
Question order bias occurs when the flow of survey questions influences how a respondent will react. Asking certain questions early on in your survey design can sway a respondent into how they later answer questions. As a simple example, asking respondents about Netflix and then in a later question asking them to name streaming platforms could show bias toward mentions of Netflix.
You can think of question order bias almost as a form of leading question: it preps a respondent so they already have something top of mind, rather than capturing their unprompted thoughts.
How to avoid question order bias
These are a few ways to avoid question order bias in your research:
Start with broad survey questions
When homing in on a topic, always start general and narrow down. For example, if you're surveying respondents about breakfast cereals, first start by asking if they typically eat breakfast, then ask what kinds of breakfast foods they eat, then ask about cereal brands they may consume.
Contrarily, if you started by asking respondents which cereals they consume from a list of cereal brands, and then later ask what they typically eat for breakfast, they will likely say cereal - because it’s on their mind from the earlier question; that’s question order bias.
Use randomization
Randomization is helpful when it comes to choice questions (i.e. multiple choice). Randomizing the list of answer options helps mitigate the risk of respondents choosing an option simply because of its position (e.g. first or last in the list).
Randomization is also useful for entire questions; this is especially useful during concept testing where you are showing visual stimuli. If you’re testing multiple images or videos, and respondents are always seeing a certain concept first, that could be influencing how they feel about the second one. By using randomization, your respondents won’t all see the same concepts first, providing more validity to your findings and removing potential question order bias.
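Answer-option randomization is straightforward to sketch: each respondent gets their own shuffled copy of the option list while the master list keeps its canonical order. The brand names below are illustrative only:

```python
import random

def randomized_options(options, seed=None):
    """Return a per-respondent shuffled copy of the answer options."""
    rng = random.Random(seed)       # seed makes the order reproducible in tests
    shuffled = list(options)        # copy, so the master list keeps its order
    rng.shuffle(shuffled)
    return shuffled

# Hypothetical option list for a streaming-platform question
brands = ["Netflix", "Hulu", "Disney+", "Prime Video"]

# Each respondent sees the same options in a different order:
print(randomized_options(brands))
```

Survey platforms typically offer this as a built-in toggle, at both the answer-option level and (for concept tests) the question or stimulus level, so no respondent position is systematically favored.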
Interviewer bias
As the name implies, interviewer bias is a form of survey bias in which a moderator's own opinions interfere with the feedback from a respondent during qualitative survey interviews (e.g. video surveys, focus groups). This type of bias can be positive or negative, intentional or not, but regardless it's a form of bias to be aware of.
Interviewer bias can come in many forms: stereotyping, demographic profiling, confirmation bias (in which the interviewer seeks to confirm a preconceived notion they hold about a respondent), or recency bias (in which the interviewer favors the most recently interviewed respondents rather than considering the full pool). These are just a few examples of interviewer bias. Below are some tips on how to avoid it:
How to avoid interviewer bias
Interviewer bias can be avoided or reduced by following these best practices:
Switch things up
Use a pool of moderators or interviewers (rather than just one) in qualitative research environments to diversify opinions, personalities, and potential implicit biases.
Use an interview guide
By providing interviewers with a guided set of questions, there’s less risk of bias inserting its way into a conversation. In addition to providing a set list of questions, a guide might also include neutral responses/reactions to keep the conversation flowing in a natural way.
Schedule peer reviews
In addition to switching up interviewers and sticking to a guide/script, it can be beneficial to periodically have peer review sessions in which a colleague sits in on another moderator’s interview process and shares notes afterward indicating where potential biases may have come to light; this is helpful for reflection so that the moderator is more aware of such biases during their next interview.
As humans, we all have our own biases; being aware of them, especially in market research, helps ensure valid, truthful responses from your survey sample. quantilope prides itself on high-quality data, which includes a focus on survey bias. One way to reduce the risk of survey bias is through the use of quantilope's automated market research survey templates. These templates have been thoroughly thought out and designed by quantilope's data science team, taking into consideration question order, question phrasing, methodology, and respondent experience.
Clients can leverage these various types of survey templates to feel confident their survey research is set up correctly and in an unbiased manner. To learn more about quantilope’s automated survey templates, or about how to limit types of bias in survey research, get in touch below: