When planning a choice-based conjoint study, one must decide how many choice tasks to give each respondent. This is an important issue because we know that if the interview is too long, respondents can become fatigued or bored, and their answers may be of little value. At the same time, we want to collect as much data from each respondent as possible, to maximize the value of each dollar spent on fieldwork.
At the AMA's 1996 ART Forum, we reported results of a project undertaken to shed light on this question. We re-analyzed data from 21 commercial CBC studies, to see how results would depend on the number of tasks respondents are given.
Data sets were contributed by Dimension Research, Griggs-Anderson Research, POPULUS, IntelliQuest, McLauchan and Associates, Mulhern Consulting, and SKIM Analytical, as well as several end-users of CBC data. The studies covered a wide variety of product categories, ranging from beverages to computers and airplanes, with fieldwork done in several countries and languages. The number of attributes ranged from three to six, and the number of choice tasks from 8 to 20. Sample sizes ranged from 50 to 1,205 respondents, and altogether the studies contained approximately 100,000 choice tasks.
Because these data sets were not designed for methodological purposes, most did not include holdout tasks that could be used to assess predictive validity. Consequently, our analysis centered on reliability and internal consistency. Here are the main findings:
How many choice tasks should you ask each respondent? You can usually ask at least 20 choice tasks without degradation in data quality. Within that range, there is no evidence of increasing random error, and later tasks provide data at least as reliable as earlier tasks.
How much information is contributed by multiple answers from each respondent? Although there is no disputing the value of sample size, considerable gains can also be made by increasing the number of tasks per respondent. Within the ranges we studied, doubling the number of tasks per respondent is about as effective at increasing precision as doubling the number of respondents.

Is there a systematic change in respondents' answers as the interview progresses? Do brand or price become more important? Do respondents become more or less likely to choose the "none" option? Yes to all three. Brand becomes less important and price more important, and respondents become more likely to choose "none" as the interview progresses. These systematic effects, rather than any anticipated increase in random noise, are what limit the number of tasks each respondent should be given.
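The intuition behind the precision claim can be illustrated with a small simulation. This is only a sketch: it idealizes each answer as an independent draw, which is consistent with the study's finding that later tasks are about as reliable as earlier ones, but it is not the study's own analysis, and the sample sizes and choice share used here are illustrative.

```python
import random
import statistics

random.seed(0)

def simulate_se(n_resp, n_tasks, p=0.6, reps=2000):
    """Standard error of an estimated choice share, treating every
    answer as an independent Bernoulli(p) draw -- an idealization,
    not the study's actual model."""
    estimates = []
    total = n_resp * n_tasks
    for _ in range(reps):
        hits = sum(random.random() < p for _ in range(total))
        estimates.append(hits / total)
    return statistics.stdev(estimates)

base       = simulate_se(100, 10)  # illustrative baseline design
more_tasks = simulate_se(100, 20)  # double the tasks per respondent
more_resp  = simulate_se(200, 10)  # double the respondents instead

print(f"baseline SE: {base:.4f}")
print(f"2x tasks SE: {more_tasks:.4f}")
print(f"2x resp  SE: {more_resp:.4f}")
```

Under this idealization, the two ways of doubling the total number of answers shrink the standard error by about the same amount, which is the pattern the re-analysis found within the ranges studied.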
Should you ask for just the first choice for each set of concepts, or is it useful to ask for second choices as well? Second choices provide more information at less cost, but they are biased. We advise asking only first choices.
How long does it take respondents to answer choice questions? How long is an interview with a certain number of tasks likely to take? Choice-based conjoint interviews go quite quickly. Average response times ranged from about 40 seconds for the first task to 13 seconds for the last. Even for 20 tasks, the longest average interview time was about 7 minutes.
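The timing figures above can be turned into a rough interview-length estimate. The sketch below assumes response times decay geometrically from the reported 40-second first task toward the reported 13-second floor; the decay rate is an illustrative assumption chosen here, not a value fitted in the study.

```python
# Endpoints are the averages reported in the study;
# DECAY is an illustrative assumption, not a fitted value.
FIRST, FLOOR, DECAY = 40.0, 13.0, 0.8

def task_time(k):
    """Assumed average time in seconds for task k (1-indexed)."""
    return FLOOR + (FIRST - FLOOR) * DECAY ** (k - 1)

total_seconds = sum(task_time(k) for k in range(1, 21))
print(f"Estimated time for 20 choice tasks: {total_seconds / 60:.1f} minutes")
```

With these assumptions a 20-task interview comes out at roughly six and a half minutes, in line with the longest average interview time of about 7 minutes observed in the data.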
We were surprised by some of these findings. There are three main things that we've learned from this analysis:
- Before doing this study we were more concerned about burdening respondents with long questionnaires than we needed to be, though it still appears that very long interviews may produce distortions in brand/price tradeoffs.
- We had been impressed by the efficiency of asking for second choices, without adequate recognition of the bias inherent in their use.
- We had incorrectly suspected that respondents often chose "none" to avoid difficult tasks, rather than because the offerings were genuinely unattractive.
Fortunately, none of these surprises consists of bad news, and we think there is good reason for the enthusiasm with which choice-based conjoint analysis has been accepted by the market research community.
A copy of the complete study can be downloaded from the Technical Papers section of our home page on the Internet.