Researching Usability

Sampling bias in online surveys

Posted on: March 5, 2010

Bias is an issue that anyone gathering user data should be wary of. Whether it's usability testing, face-to-face interviews or online questionnaires, bias can affect the strength and integrity of results in a variety of ways. Question design is one of the most influential factors and should therefore be given careful consideration. Leading questions can inadvertently give participants an idea of the desired answer and influence their response. However, sampling bias can also have a significant effect on research results and is often overlooked by researchers.

I was reading Europeana’s Annual Report this week and noticed that the results from their online visitor survey were, on the whole, very positive. Reading the survey report in more detail, I realised that sampling bias may have affected the results. Data from online visitor surveys are normally gathered using an intercept, which invites visitors to participate in the research when they arrive at the site. Anyone visiting the site who receives this invitation is eligible to participate, making respondents ‘self-selected’: they decide to participate, not the researcher. Their motivation for participating may be related to the topic of the survey or the incentive provided to garner interest. Consequently, their participation is unlikely to produce a representative sample.

For example, those who participated in Europeana’s survey are more likely to have been motivated by their enthusiasm for and interest in the website, while those who are apathetic or indifferent to it are less likely to have participated. This is supported by the proportion of participants who were regular visitors to the site. Only 8.6% of participants were first-time visitors, and the results from these participants were generally more positive than those from participants who had visited the site before. It would be interesting to find out whether a larger sample of first-time users would alter these results.
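The self-selection effect can be illustrated with a small simulation. The numbers below are entirely invented (this is a sketch, not Europeana’s data): if the probability of opting in to a survey rises with a visitor’s satisfaction, the sample mean will overstate the population mean.

```python
import random

random.seed(42)

# Hypothetical population of 10,000 visitors with satisfaction scores
# on a 1-10 scale (all figures invented for illustration).
population = [min(10, max(1, random.gauss(5.5, 2.0))) for _ in range(10_000)]

# Self-selection: the more satisfied a visitor is, the more likely they
# are to accept the survey intercept (probability s/20, i.e. 5%-50%).
respondents = [s for s in population if random.random() < s / 20]

true_mean = sum(population) / len(population)
sample_mean = sum(respondents) / len(respondents)

print(f"Population mean satisfaction:  {true_mean:.2f}")
print(f"Self-selected sample mean:     {sample_mean:.2f}")
```

Because enthusiasm drives participation, the self-selected sample mean comes out noticeably higher than the population mean, even though every visitor was "eligible" to take part.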

So what can researchers do to prevent sampling bias in their results? It is very difficult to remove it completely, especially in online surveys where the researcher has no control over who participates. Generally speaking, visitor surveys will always carry a risk of bias, so the aims of the survey should take this into account. Designing a mixture of open and closed questions will provide some insight into participants’ motivation: descriptive answers, which require more thought, are less likely to be completed fully by those motivated only by the incentive. This also has the added benefit of giving users the opportunity to provide their own feedback. It is interesting to note that Europeana did not do this, leading some participants to email their comments to the researchers. Providing an optional section at the end of the survey for final comments could have yielded rich feedback not obtainable through closed questions. Indeed, the comments Europeana received often described situations where users had trouble using the site or disliked a particular design feature.

Avoid asking questions about the user’s overall opinion of the system before they have used all of its features, as the answers will not be accurate. For example, 67% of users stated they had never used the “My Europeana” feature and were therefore unable to provide feedback on it. Usability testing often provides more insight into such issues by gathering this information retrospectively, after asking a user to carry out tasks using the site. If survey software that can do this is available, it is recommended, because it is more likely to gather meaningful results. It is only after trying to complete a task that a user can accurately describe their experience.

It is worth noting that Europeana have also conducted user testing with eyetracking in addition to focus groups and expert evaluations. The results of these are due to be published soon and I look forward to reading them. It will be interesting to compare the results against our heuristic inspection of Europeana and other DLs.


3 Responses to "Sampling bias in online surveys"


I have been looking for an appropriate place to make a comment about surveys. I have been receiving them via for years and they have consistently sought out Hispanic respondents. I know this because I frequently get rejected from the survey when I respond to the question about my ethnic background (Caucasian, non-Hispanic).
Because of this, I don’t think the surveys have any credibility at all. They seek to emphasize Hispanic respondents in order to get the results that they want to see. How anyone can respect that is beyond me. It’s much like what the schools have done for the last couple of decades, trying to record students who have a very small amount of Spanish heritage (i.e., 25% or less) as Hispanic, to get funding for a protected (i.e., kid-glove treatment) group.


You’re totally wrong in your assessment. I’ve just conducted an online survey, and found that, for some reason, by far the hardest population to get ahold of is Hispanics. Surveyors will OVERsample under-represented, or hard-to-reach, demographics such as Hispanics on purpose in order to get a statistically significant and accurate measure of that subgroup’s attitudes. THEN, that demographic’s numbers are weighted according to the demographics of the whole population being surveyed.

It has nothing to do with playing to your audience to get the response you want from a demographic you want. It has everything to do with getting a statistically usable sample of a small subgroup, and then bringing that data in line proportionately with the actual percentage of that subgroup found in the population you are testing.
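The oversample-then-weight procedure described above can be sketched numerically. All of the figures here (population shares, sample sizes, mean responses) are invented purely for illustration:

```python
# Assumed population shares and a deliberately oversampled survey:
# the small subgroup is oversampled to get a usable n, then weighted
# back down to its true share of the population.
population_share = {"hispanic": 0.18, "non_hispanic": 0.82}  # invented shares
sample_counts    = {"hispanic": 300,  "non_hispanic": 700}   # oversampled

total_sampled = sum(sample_counts.values())

# Post-stratification weight = population share / sample share.
weights = {
    g: population_share[g] / (sample_counts[g] / total_sampled)
    for g in sample_counts
}
print(weights)  # subgroup responses weighted down, majority weighted up

# Invented mean responses (e.g. satisfaction on a 1-5 scale) per group:
mean_response = {"hispanic": 3.2, "non_hispanic": 4.1}

# Weighted overall mean, in line with true population proportions:
weighted_mean = sum(
    weights[g] * sample_counts[g] * mean_response[g] for g in sample_counts
) / total_sampled
print(f"Weighted overall mean: {weighted_mean:.2f}")
```

The weighted mean works out the same as taking each group’s mean at its true population share (0.18 × 3.2 + 0.82 × 4.1), which is exactly the point: the oversample buys statistical precision for the small subgroup without distorting the overall estimate.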
