Researching Usability

Bias is an issue that anyone gathering user data is wary of. Whether it's usability testing, face-to-face interviews or online questionnaires, bias can affect the strength and integrity of results in a variety of ways. Question design is one of the most influential factors and should therefore be given careful consideration: leading questions can inadvertently give participants an idea of the desired answer and influence their response. However, sampling bias can also have a significant effect on research results and is often overlooked by researchers.

I was reading Europeana’s Annual Report this week and noticed that the results from their online visitor survey were, on the whole, very positive. Reading the survey report in more detail, I realised that sample bias may be affecting the survey results. Data from online visitor surveys are normally gathered using an intercept, which invites a visitor to participate in the research when they arrive at the site. Anyone visiting the site who receives this invite is eligible to participate, making them ‘self-selected’: they decide to participate, not the researcher. Their motivation for participating may be related to the topic of the survey or to the incentive provided to garner interest. Consequently, their participation is unlikely to provide a representative sample.
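
To make the intercept mechanism concrete, here is a minimal sketch of how such an invite is often wired up, assuming a hypothetical showSurveyInvite() helper and an arbitrary 10% sampling rate (neither is taken from Europeana's survey). Note that even when the invite is shown to a random subset of visitors, whether a visitor accepts it remains self-selected.

```typescript
// Minimal sketch of a survey intercept (illustrative only, not Europeana's code).
// INVITE_RATE and showSurveyInvite() are hypothetical names.
const INVITE_RATE = 0.1; // show the invite to roughly 10% of arriving visitors

function showSurveyInvite(): void {
  // Placeholder: a real site would render a banner or modal linking to the survey.
  console.log("Would you like to take part in our short visitor survey?");
}

function onVisitorArrival(): void {
  // Randomising who *sees* the invite spreads it across visitors,
  // but whoever chooses to accept is still self-selected.
  if (Math.random() < INVITE_RATE) {
    showSurveyInvite();
  }
}

onVisitorArrival();
```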

For example, those who participated in Europeana’s survey are more likely to have been motivated by their enthusiasm for and interest in the website, while those who are apathetic or indifferent to it are less likely to have participated. This is supported by the proportion of participants who were regular visitors to the site: only 8.6% of participants were first-time visitors, and the results from these participants were generally more positive than those from participants who had visited the site before. It would be interesting to find out whether a larger sample of first-time users would alter these results.

So what can researchers do to prevent sample bias in their results? It is very difficult to remove sample bias completely, especially in online surveys where the researcher has no control over who participates. Generally speaking, visitor surveys will always carry a risk of bias, so the aims of the survey should take this into account. Designing a mixture of open and closed questions will provide some insight into participants’ motivation: descriptive answers which require more thought are less likely to be fully answered by those motivated only by the incentive. This also has the added benefit of giving users the opportunity to provide their own feedback. It is interesting to note that Europeana did not do this, leading some participants to email their comments to the researchers. Providing an optional section at the end of the survey for final comments could have yielded rich feedback not obtained through closed questions. Indeed, the comments Europeana received often described situations where users had trouble using the site or disliked a particular design feature.

Avoid asking questions about a user’s overall opinion of the system before they have used all of its features, as the answers will not be accurate. For example, 67% of users stated they had never used the “My Europeana” feature and were therefore unable to provide feedback on it. Usability testing often provides more insight into these issues by gathering this information retrospectively, after asking a user to carry out tasks using the site. If it is possible to use survey software which can do the same, it is worth doing, because it is more likely to gather meaningful results: it is only after trying to complete a task that a user can accurately describe their experience.

It is worth noting that Europeana have also conducted user testing with eye tracking, in addition to focus groups and expert evaluations. The results of these are due to be published soon and I look forward to reading them. It will be interesting to compare them against our heuristic inspection of Europeana and other DLs.

