How to determine the right survey length

6th June 2016 14:54

This blog was written by Matt Walker, Senior Research Executive

The length of online surveys can be something of a frustration for both researchers and clients. The client will, of course, want to squeeze as much value out of a research project as possible, which inevitably leads to packing as much as possible into the questionnaire. On the other side is the researcher, who wants to keep the questionnaire tight in order to reduce its length and, as a result, increase response rates. But exactly what impact does a long questionnaire have on response rates? And what can be done about it?

A panel provider recently ran two surveys with similar samples and no screening criteria, but of different lengths – one lasting 10 minutes, the other 30 minutes. They found that the 30 minute survey had a drop-out rate more than three times that of the 10 minute survey (40% v 12%), and that it achieved well under half as many completes overall (122 v 320).

Respondents also rated the longer survey less positively (2.8/5 stars for the 30 minute survey, 4.3/5 stars for the 10 minute survey). But what about the quality of the responses? In terms of straight-liners (people who respond to all questions with the same answer in order to complete the survey as quickly as possible), 5% of respondents straight-lined in the 10 minute survey, compared with 20% in the 30 minute survey.
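
To make the straight-lining check concrete, here is a minimal sketch of how it might be flagged in raw response data. The study above doesn't describe its method, so the DataFrame and column names below are hypothetical; the rule simply marks respondents who gave an identical answer to every item in a rating grid.

```python
import pandas as pd

# Hypothetical responses: one row per respondent, one column per item
# in the same rating grid (column names are illustrative only).
responses = pd.DataFrame({
    "q5_brand_trust":   [5, 3, 4, 2],
    "q5_brand_value":   [5, 4, 4, 2],
    "q5_brand_quality": [5, 2, 4, 2],
})

# A respondent straight-lines the grid if every item received the same answer.
is_straight_liner = responses.nunique(axis=1) == 1

print(f"{is_straight_liner.mean():.0%} of respondents straight-lined this grid")
```

In practice you would run the same check across every grid in the survey and treat respondents who straight-line most of them as candidates for removal.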

So survey length doesn’t just have a negative impact on response and drop-out rates; it also yields poorer quality data. But what can actually be done to avoid these issues? Is survey length the be-all and end-all?

1.    Review the questions in the survey.

  • Are they all completely necessary for this research project? For example, a set of questions that doesn’t directly relate to the survey topic might be better placed in a separate survey altogether.
  • Does each question provide meaningful insight? For example, a question on a tracking project might have yielded similar data for some time, suggesting that it now adds little new insight.
  • Could some of them be reworded to gather the same information in a more economical way? Sometimes several questions are asked about a similar topic when rewording could reduce them to just one or two.
  • Does the survey include demographic questions covering data that has already been collected? For example, sample drawn from a panel often comes with this information attached.


2.    Make it enjoyable to take part in.

  • Is it overly complicated? Simplifying the survey reduces the mental effort required and the likelihood of drop-out.
  • Are the questions in a logical order? Splitting the questions into topic areas and making each section clear in the survey can help participants to refocus and clear their mind before answering questions about a different topic.
  • Is there an on-going dialogue with the respondent? For example, text substitution that refers back to a respondent’s previous answers can make the survey feel more personal and show that their responses are being noted (see the sketch after this list).
  • Are images, interactive icons and gamification included to boost respondent morale? These can be used to break up the often monotonous task of answering many text-heavy questions – even simple things like colour can help.
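
As a rough illustration of the text substitution idea mentioned above, the snippet below pipes earlier answers into the wording of a follow-up question. Survey platforms typically offer this as a built-in feature; the dictionary and question wording here are made up for the example.

```python
# Earlier answers stored by question ID (names and values are illustrative).
answers = {"favourite_brand": "Brand X", "purchase_frequency": "about once a week"}

# Pipe the stored answers into the wording of a later question.
follow_up = (
    "You told us you buy {favourite_brand} {purchase_frequency}. "
    "What do you like most about it?"
).format(**answers)

print(follow_up)
# You told us you buy Brand X about once a week. What do you like most about it?
```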


3.    Be honest about the length.

  • How long will it take realistically? Testing the survey as a respondent and timing how long it takes to complete can give you an indication of this.
  • Can you communicate a likely screen-out rate from the outset? This can be estimated from the number of screening questions you have and from previous surveys with similar audiences, as sketched below.
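
As a back-of-the-envelope sketch of that estimate, the snippet below multiplies an assumed pass rate for each screening question to get an overall incidence, then reports the implied screen-out rate. The pass rates are invented for the example, and the calculation assumes the screeners filter independently of one another.

```python
# Assumed share of respondents who pass each screening question (illustrative).
pass_rates = [0.80, 0.60, 0.90]

# Overall incidence if the screeners filter independently of one another.
overall_incidence = 1.0
for rate in pass_rates:
    overall_incidence *= rate

print(f"Estimated incidence: {overall_incidence:.0%}")            # 43%
print(f"Estimated screen-out rate: {1 - overall_incidence:.0%}")  # 57%
```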

