This summer we’ve been working on the next version of ACBC, which will include alternative-specific designs plus an overhaul of the design-generation algorithm. We’ve implemented an extremely fast level relabeling and swapping routine that takes the previous designs as a starting point and then improves them substantially. The good news is that we’re finding we can boost the efficiency of ACBC designs (at the individual level) by about 5% to 40%, depending on the study. The biggest gains come in designs where most or all of the attributes are dropped from the BYO section. (We hope to release this new software in Q4.)
We’re obtaining these gains in efficiency (D-efficiency) while maintaining the same degree of purposeful oversampling of BYO-selected levels (near-neighbor, pivot designs) as before. The gains are mainly due to finding better combinations of levels to reduce the correlation within the design matrix. (Though we’re also doing better in terms of 1-way level balance for the non-BYO selected levels in the design.)
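To give a feel for what “boosting D-efficiency by swapping levels” means, here is a toy sketch. The formula is the generic textbook definition of relative D-efficiency, and the hill climb is our own simplified illustration of a level-swapping routine; neither is Sawtooth’s actual implementation.

```python
import numpy as np

def d_efficiency(X):
    # Relative D-efficiency of a coded design matrix X:
    # |X'X / N| ** (1/p), with N runs and p parameters.
    # (Generic textbook formula, not the ACBC software's internals.)
    N, p = X.shape
    det = max(np.linalg.det(X.T @ X / N), 0.0)  # clamp tiny negative round-off
    return det ** (1.0 / p)

def improve_by_swapping(X, rng, n_iter=500):
    # Toy hill climb in the spirit of a level-swapping routine:
    # swap two entries within a random column, and keep the swap
    # whenever D-efficiency does not decrease.
    X = X.copy()
    best = d_efficiency(X)
    for _ in range(n_iter):
        col = rng.integers(X.shape[1])
        i, j = rng.integers(X.shape[0], size=2)
        X[[i, j], col] = X[[j, i], col]
        eff = d_efficiency(X)
        if eff >= best:
            best = eff
        else:
            X[[i, j], col] = X[[j, i], col]  # revert the swap
    return X, best

# A deliberately correlated 4-run, 2-attribute design (effects coded):
X0 = np.array([[1., 1.], [1., 1.], [-1., -1.], [-1., -1.]])
rng = np.random.default_rng(0)
X1, eff = improve_by_swapping(X0, rng)
```

Because swaps happen within a column, the mix of levels shown for each attribute (and hence any purposeful oversampling of BYO-selected levels) is preserved; only the combinations across attributes change, which is exactly where the correlation in the design matrix lives.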
In addition to the improved efficiency of the new design algorithm, the next version includes a setting that asks the experimental designer to avoid generating dominated concepts. (A dominated concept is one that is clearly inferior to another product concept: no better on any attribute, and worse on at least one.) Also, you can force the BYO-selected concept into the choice tournament. Plus, you can now test your designs by asking the software to automatically generate hundreds of simulated respondents. It’s actually kind of fun!
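For intuition, dominance between two concepts reduces to a simple pairwise comparison. The score-tuple representation below is our own illustration (higher = more preferred on that attribute), not the software’s internals:

```python
def dominates(a, b):
    # True when concept `a` dominates concept `b`: at least as good on
    # every attribute and strictly better on at least one. Concepts are
    # tuples of per-attribute desirability scores (higher = better) --
    # an illustrative framing only.
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Concept (3, 2, 2) dominates (1, 2, 2): equal on two attributes, better on one.
# Concepts (3, 1) and (1, 3) involve a trade-off, so neither dominates the other.
```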
We don’t yet know how much practical benefit these improvements to ACBC will deliver. Given that we’re boosting D-efficiency while maintaining the same degree of near-neighbor focus in the designs (and the same cognitive complexity of the surveys), we expect it can only be a good thing. We plan to conduct a split-sample test soon to investigate.