At the 2007 Sawtooth Software Conference, we presented our latest research into Adaptive CBC. This current stream of research is a significant departure from the Adaptive CBC approaches we have tried and described at previous conferences. And, we’re happy to say, this approach seems to work better than traditional CBC for complex studies involving about five or more attributes. Importantly, respondents find the interview more engaging, more realistic, and focused on the levels most relevant to their choices.
Our approach involves first asking respondents to indicate which product they’d most likely purchase. We’ve formatted that phase as a BYO (configurator) task, but it could probably also be done as an indication of “most likely” or “preferred” levels for each attribute. In the second phase of the adaptive interview, we ask respondents to screen product concepts that resemble their “most likely” product. Respondents indicate whether each concept is a possibility or not. After a few choices, if the respondent seems to be using non-compensatory rules (i.e., “must-have” or “unacceptable” levels), we identify the possible rule and ask the respondent to confirm or skip it. This process of observing respondent choices and following up with the opportunity to define decision rules is repeated. And, of course, if the respondent indicates that a particular level is either required or unacceptable, only products meeting that criterion are shown throughout the remainder of the interview. In the final phase, we ask the respondent to compare the screened-in products using standard CBC choice tasks. This is a round-robin tournament that identifies an overall winning concept.
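The rule-detection step in the screening phase can be sketched in code. The following is a minimal illustration, not our actual algorithm: the function name, data layout, and the `min_evidence` threshold are all hypothetical. It flags a level as a candidate “unacceptable” rule when it appears in several rejected concepts but never in an accepted one, and as a candidate “must-have” rule when every accepted concept contains it while several rejected concepts lack it.

```python
def candidate_rules(concepts, responses, min_evidence=3):
    """Infer candidate non-compensatory rules from screening responses.

    concepts:  list of dicts mapping attribute -> level shown.
    responses: parallel list of booleans (True = "a possibility").
    Returns two lists of (attribute, level) pairs: candidate
    "unacceptable" rules and candidate "must-have" rules, which the
    respondent would then be asked to confirm or skip.
    """
    accepted = [c for c, r in zip(concepts, responses) if r]
    rejected = [c for c, r in zip(concepts, responses) if not r]

    unacceptable, must_have = [], []
    attributes = concepts[0].keys() if concepts else []
    for attr in attributes:
        for lvl in {c[attr] for c in concepts}:
            rej_with = [c for c in rejected if c[attr] == lvl]
            acc_with = [c for c in accepted if c[attr] == lvl]
            # Seen repeatedly in rejections, never accepted: maybe unacceptable.
            if len(rej_with) >= min_evidence and not acc_with:
                unacceptable.append((attr, lvl))
            # Present in every accepted concept, absent from several
            # rejected ones: maybe a must-have.
            acc_without = [c for c in accepted if c[attr] != lvl]
            rej_without = [c for c in rejected if c[attr] != lvl]
            if accepted and not acc_without and len(rej_without) >= min_evidence:
                must_have.append((attr, lvl))
    return unacceptable, must_have
```

For example, a respondent who rejects every concept carrying one brand (and accepts some concepts with the other) would trigger an “unacceptable” candidate for that brand, which the interview would then present for confirmation before restricting the remaining concepts.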
To complete an online Adaptive CBC survey yourself, visit: ACBC Sample Surveys
To this point, we have completed two studies. We have published the findings in the Technical Papers library on our website, in an article entitled: A New Approach to Adaptive CBC (2008)
What are we doing now? We are currently wrapping up a third study, and will be presenting the results at the 2008 A/R/T Forum Conference in June. In this latest research, we are testing whether the adaptive interviews can be made significantly shorter without losing much predictive accuracy. We’ve also implemented an improvement to the experimental designs that may allow us to get away with shorter interviews, so this latest work involves more than just shortening the interview.
We are also developing a beta version of Adaptive CBC software. Because this is such a new approach, it is critical that we obtain more data points and greater experience before launching a commercial v1 product. We’ll announce the beta software when it is available, and will be enlisting your help to further test this promising methodology.