
Achieving Better Response Rates for Disk-By-Mail Surveys

The following is an excerpt from an article entitled "Best Practices in Disk-By-Mail Surveys" by Karlan Witt of IntelliQuest, Inc. and Steven Bernstein of Apple Computer, originally presented at our 1992 Sawtooth Software Conference. The complete article is available for downloading from our Technical Paper Library on our home page: www.sawtoothsoftware.com.

Typical Response Rates on DBM Studies

Response rates on IntelliQuest disk-by-mail studies have ranged from 35%, for a study of an over-surveyed group conducted during the summer, to 70%, when a high-profile client was disclosed as the sponsor. Given this potential twofold difference, it is important to attend to every factor that affects response rate.

The most influential factor affecting response rates is the disclosure of a highly respected corporate sponsor. The next most influential is the sample itself: very senior executives, decision makers, and employees in certain functions produce lower response rates. The reported survey length also affects response rate dramatically.

Potential Bias in DBM

Achieving a high response rate is beneficial in two ways:

  • it decreases the cost per completed interview
  • it increases the representativeness of survey results

For DBM surveys, even more than for other data collection methodologies, non-respondents are potentially systematically different from respondents in at least one way: their access to personal computers. Although respondents may be screened for access to PCs, that screening itself may introduce a source of non-response bias.

Factors Affecting Response Rate

The following are critical aspects to consider:

1. Saliency of survey topic. The more interesting and relevant the topic is to the target audience, the higher the response rate.

2. Length of survey. Two components of survey length affect response rate. The first is the expected time to complete the survey, if reported to the respondent in the cover letter; this eliminates respondents who are unwilling to commit that much time to the interview. The second is the perceived time elapsed while taking the survey: some respondents may begin an interview but terminate if they perceive that it is running too long. It is worth noting that for disk-based surveys, respondents' perception of elapsed time is less than the actual time elapsed.

The shorter the survey, typically the higher the response rate. Keeping the survey short is critical to gaining an adequate response from over-surveyed populations and from respondents who place a high value on their time.

3. Respect for respondents' time; high professional ethics. Although respondents will accept longer interviews under a DBM methodology, the researcher must respect respondents' time.

4. Composition of research sample. Certain populations, such as purchase decision influencers and senior executives, are asked to participate in many surveys, while others place a very high value on their time. These groups typically demonstrate lower than average response rates.

5. Access to personal computers. The majority of DBM surveys are conducted on IBM-compatible personal computers. Whether a Macintosh survey software diskette is offered as an additional option depends largely on the target audience and the objectives of the research. In either case, respondents must be known to have or must be screened for access to a personal computer.

Depending on the subject matter being measured, respondents without access to PCs may or may not differ from respondents with personal computers. Those without access should be asked to answer primary demographic and firmographic questions, as well as attitudinal questions about the subject being measured, so that the potential for bias in the non-respondent sample can be analyzed.

6. Convenience of taking the survey. A DBM survey permits respondents to complete the survey at a time of their choosing. This convenience gives DBM surveys an advantage over telephone and other data collection methodologies, and produces a higher overall response rate. Additionally, providing all materials necessary to complete and return the survey (such as a postage-paid return disk mailer) will increase response rate.

7. Sponsorship of survey disclosed. One of the key factors affecting response rate is whether the sponsor of the research is disclosed. While disclosure is clearly not appropriate in most studies, where it is appropriate it will increase the response rate, especially if the sponsor is respected by the target audience, as in product follow-up surveys.

Disclosing the sponsor may also benefit the sponsoring company. In one customer satisfaction study, 35% of respondents stated that their attitudes toward the sponsor improved as a result of receiving the survey from the sponsor.

8. Guarantee of anonymity or confidentiality. Mailed surveys in general offer respondents some degree of anonymity; lack of anonymity is often a source of non-response in other data collection methodologies. This anonymity helps reduce both item non-response and unit non-response.

9. Priority or First Class mail. Respondents react to a package as soon as it arrives; the professional appearance of the package and its contents forms their first impression. The goal is to have respondents complete the survey immediately.

In one IntelliQuest study, a split sample was used to test the effect of First Class vs. bulk rate postage on response rate. The response rate for the sample using First Class postage was 32%, while the response rate for the bulk rate sample was 27%. In debriefings with respondents from another study, we found that faster mailing methods (for example, Federal Express or USPS Priority Mail) signal that the survey is of great importance to the research sponsor, making respondents more likely to respond, and to respond soon after receiving the survey.

10. Personalized cover letters and envelopes. This is a specific illustration of the packaging discussion above. The more professional the packaging and presentation from the research sponsor, the higher the response. While personalized cover letters increase response rate, even small typographical errors in the cover letter may have an adverse effect on response rate.

11. Incentive. Incentives are one of the most interesting and most debated response rate enhancers in survey research. Most sources report that incentives of any kind increase response rate. To examine the effect of offering incentives, IntelliQuest performed an experiment where potential respondents were randomly assigned to one of two groups. One of these groups was offered a coffee mug as an incentive for responding. The other was not offered an incentive. The promised incentive increased the response rate from 45% to 54%.

Selection of incentives may also affect response. Incentives should be appealing and motivating to the target respondents. Incentives may be job-related, such as an executive summary of the research results or a chance to win office equipment, or personal, such as a chance to win cash, a trip, or other such prizes. We have found that a choice of prizes is effective, particularly when the choices consist of targeted prizes. For instance, early adopters of technology respond to high-end technology gadgetry.

In targeting incentives, be cautious not to offend the intended respondents. In a debriefing of Fortune 500 senior executives, we found that some respondents felt the use of a $1 bill was insulting, considering the value of their time; conversely, many thought the $1 communicated that the survey was important to the survey sponsor.

12. Pre-notification/pre-screening. In many studies it is necessary to contact respondents in advance of the mailed survey to:

  • Identify the individual who should receive the survey
  • Pre-qualify individuals for the study
  • Identify to which market segment or quota group a respondent belongs
  • Screen for access to a personal computer
  • Verify address

Even when it is not necessary to conduct a pre-screening call, we have found that pre-notifying respondents, either by mail or phone, increases response rate. Pre-notification legitimizes the survey and communicates its importance to the survey sponsor.

Additionally, pre-qualifying respondents by telephone ensures that everyone who receives the survey is eligible to participate. If non-qualified individuals receive survey disks and do not respond, they may be counted in the non-response category, even though it is not non-response bias when an unqualified individual fails to respond.

It is important for respondents to receive the survey package soon after the pre-notification. For a telephone pre-notification, we have found it most effective for respondents to receive the package within two to three days. With written pre-notification (letter or postcard), we have found it most effective for the package to be received approximately five to seven days after the notification.

13. Second mailing or follow-up postcard or phone call. As with pre-notification, a reminder call or postcard increases response rate. This follow-up may be used to thank respondents if they have already responded, and gain share of mind among those who have not yet responded. In one DBM study, the use of reminder phone calls almost doubled the response rate with a difficult-to-survey population.

Effect of Incentives/Reminder Postcards

Incentives and reminder postcards are effective ways to increase response rates. To study their effects on response rate, we conducted a study in 1987 in which four groups each received one combination of two factors: a $1 incentive or no incentive, and a reminder postcard sent five days later or no reminder postcard.

As shown in the table below, the group that received both the $1 incentive and the reminder postcard had a 46% response rate, while the group that received neither had a 33% response rate. The group that received the $1 incentive but no reminder had a 39% response rate, and the group that received no incentive but did receive a reminder postcard had a 34% response rate. For this study, the $1 incentive worked well by itself and better still in conjunction with the reminder card; the reminder card used alone increased response rate only slightly.

We have found that incentives typically pay for themselves because the increased response rate requires fewer survey packages to be mailed to achieve the same number of completed interviews.

Response Rates For Experimental Study

  $1 incentive / reminder postcard       46%
  $1 incentive / no reminder postcard    39%
  No incentive / reminder postcard       34%
  No incentive / no reminder postcard    33%
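
The following is a minimal sketch (in Python) of the pay-for-themselves arithmetic described above, using the two no-reminder response rates from this experiment (33% and 39%). The target number of completed interviews, the per-package cost, and the break-even figure are assumed values chosen for illustration only, not numbers from the study.

    # A minimal sketch of the "incentives pay for themselves" arithmetic.
    # Response rates come from the 1987 experiment above (33% with no
    # incentive and no reminder, 39% with a $1 incentive and no reminder).
    # The target number of completes and the per-package cost are assumed.

    TARGET_COMPLETES = 400     # completed interviews needed (assumed)
    COST_PER_PACKAGE = 8.00    # disk, mailer, postage per package (assumed)
    INCENTIVE = 1.00           # $1 bill enclosed in every package

    def total_cost(response_rate, incentive=0.0):
        """Cost of mailing enough packages to reach the target at this rate."""
        packages = TARGET_COMPLETES / response_rate
        return packages * (COST_PER_PACKAGE + incentive)

    print(f"No incentive (33% response): ${total_cost(0.33):,.0f}")            # ~$9,697
    print(f"$1 incentive (39% response): ${total_cost(0.39, INCENTIVE):,.0f}")  # ~$9,231

    # Break-even: the $1 incentive pays for itself whenever the base cost per
    # package exceeds about 0.33 / (0.39 - 0.33) * $1, or roughly $5.50.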