Sawtooth Software: The Survey Software of Choice

On June 26, 2017, Bryan Orme, the President of Sawtooth Software, Inc., was awarded the American Marketing Association's Charles Coolidge Parlin Marketing Research Award at the 2017 AMA Advanced Research Techniques Forum in Seattle, Washington. Below is his speech, transcribed from an audio recording, with minor edits to improve flow and grammatical construction.

Thank you for staying after the main sessions to indulge me in this amazing honor. I’ve already told many of you that when I got the email from the AMA about two months ago announcing that I was going to be receiving the Parlin Award…when I saw the subject line of the email I initially thought it would be asking for nominations, or that it would just be informing us about who had won the Parlin award. You start reading it and you stop, and pause, and back up. You read it again. You stop. You read it again…slower this time, to make sure it is indeed saying what you think it is saying. It was a profound thing and such a surprise for me because I look up to so many of the Parlin award winners. Just to be associated with them is the greatest professional honor of my life.

It’s impressive as you look over the names of past Parlin winners: George Gallup and Arthur Nielsen—certainly those of us in the business of market research know those names. Back in grad school I read Michael Porter’s Competitive Advantage and Philip Kotler’s marketing texts that are so well known, as well as Jordan Louviere’s little green Sage book, Analyzing Decision Making. And of course there is Paul Green’s influence. There are so many outstanding Parlin award winners who have come before.

When I learned that I was receiving the award, I said to my wife, “You won’t believe this, but the American Marketing Association is giving me the Parlin Award this year.” She looked at me and said, “Great. Tell me what that is.” And I told her about some of the people who had won the award. And, she was very, very impressed—especially when she heard that Greg Allenby had won the award <pointing to Greg in the audience>. Greg wore a coat when presenting earlier today, so I thought I had better go looking for a coat so I would at least look as good as he did when receiving this award.

I’m not standing here because of any superior intellect. Certainly not because of deep training in statistics, an area in which so many of you in this room outshine me. I’m going to speak for a bit about why I’m standing here and pay tribute to people who have helped me along the way. I know this is going to feel like a walk down Bryan Orme’s memory lane. But I think there is something to appreciate about these individuals who went before us and something we can learn from them. In gatherings like this we can latch on to those who can mentor us and collaborate with us, because the ART/Forum is supposed to promote collaboration between academics and practitioners. Indeed I’ve been greatly blessed by it over the years.

I’m largely here because I belong to an organization whose express mission includes educating and sharing. Rich Johnson, the founder of Sawtooth Software, ingrained that in us. Imagine if you could focus almost exclusively on just one area—and I tend to focus on choice modeling and conjoint analysis—and you could do that for over 20 years. Imagine if your organization gave you free rein with your time to travel and teach about it, conduct research, write articles about it, and share openly. And nobody ever said to you, “I’m sorry, you can’t share that. I’m sorry, that’s proprietary.” Imagine nobody ever said to you, “Don’t say that, because we don’t want them to know exactly how we’re doing it.”

Imagine nobody ever told you any of that, and instead just said, “Go out and advocate.” And imagine you also got lucky enough to become involved with a research technique that has a lot of legs in the industry—for good reason. You could share a lot and do a lot of good. I think that’s why I’m here—because of my lucky circumstances to belong to such an organization and to have had leadership that encouraged us and put no limitations on our ability to share and be transparent. So I’m deeply indebted to Rich Johnson and Sawtooth Software.

I’m going to talk a little bit about some past Parlin Award winners: Rich Johnson, Greg Allenby, John Hauser, Steve Cohen, Jordan Louviere, and Paul Green. I’ll begin with Rich who won the Parlin award in 2002.

I first met Rich in 1993 when I was working at IntelliQuest right out of graduate school. He was a consultant to IntelliQuest and met with us quarterly. Rich always taught us to share ideas openly and in the simplest, most understandable ways possible. I remember one time he said of a particular person who had just given a speech, “He speaks in code.” He meant that the person was purposely veiling what they were doing so it could not be understood, which was of course something Rich adamantly opposed. He didn’t want us to create anything proprietary or black box that couldn’t be challenged and scrutinized.

And Rich always said, “If somebody wants to come and challenge what we do at our organization, we’ll put them right at the top of the list for a speaking slot.” So he never shied away from that. Rich always said to make sure our presentations would deliver something of value to both the most sophisticated and the least sophisticated members of the audience. He wanted to be inclusive and make sure that everybody got at least something good out of it when he presented. Rich taught us to keep our eyes on academia and then to take the best ideas that will work every time in practice and make them available to practitioners. Working every time and being stable was something that Rich always prized. And Rich wasn’t afraid to be pragmatic if something just plain worked in practice. We often call that “duct taping” around here, but sometimes the duct tape works. And when you can prove it over and over again—yeah, we might not win any awards or get published in JMR, but we can certainly do really well by our clients, and that’s something he always wanted to do. At the same time, Rich taught us to be open and honest about what works and what doesn’t, and to be willing to admit if we’re wrong.

I remember one time Karlan Witt and I were traveling with Rich, and a couple of times while walking through the hotel we noticed a plate of chocolate chip cookies in the lobby. Rich would find himself gravitating toward the chocolate chip cookies and taking one or two on his way by. The morning came and we got dressed to go out and visit the client. Karlan and I emerged from the hotel, turned around, and Rich wasn’t there. I said to Karlan, “90% chance Rich has stopped to get a cookie.” Rich emerged from the hotel and I asked if he’d stopped for a cookie, to which he said he hadn’t. And I said, “Well, I guess I was wrong.” He said, “What were you wrong about?” And I said, “I told Karlan there was a 90% chance that you had stopped to get a cookie. But you didn’t.” Rich replied, “Well, that didn’t necessarily mean you were wrong—it could have been the 10% of the time.” It was a good insight, and it’s something we sometimes forget because we usually like simple declarations one way or the other. But our models give probabilistic outcomes: draws that paint a distribution of outcomes, from which we can express confidence intervals. We should focus on those kinds of probability statements rather than just point estimates.
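Rich’s point about probability statements carries straight into how model results get reported. As a small illustration (with simulated draws, not data from any real study), here is how one might summarize posterior draws for a part-worth with an interval rather than a single point estimate:

```python
import random
import statistics

def credible_interval(draws, level=0.95):
    """Central credible interval from a list of posterior draws."""
    tail = (1.0 - level) / 2.0
    ordered = sorted(draws)
    n = len(ordered)
    return ordered[int(tail * (n - 1))], ordered[int((1.0 - tail) * (n - 1))]

# Simulated posterior draws for one part-worth utility (illustrative only):
random.seed(7)
draws = [random.gauss(1.2, 0.4) for _ in range(5000)]

point = statistics.mean(draws)        # the point estimate alone...
low, high = credible_interval(draws)  # ...versus the interval around it
```

Reporting the pair (low, high) tells a client how sure we are, in the same spirit as the 90% cookie prediction.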

Moving on, I want to talk a little bit about Greg Allenby, who won the Parlin award in 2012. My first recollection of Greg was in 1994 when I was working at IntelliQuest. I heard that there was this professor who wanted to try out a hare-brained statistical approach for reanalyzing some conjoint data…and did we have some data to share with him? It was a small ACA dataset of maybe 300 respondents. I think Greg was working on a Sun Microsystems mainframe at the Ohio State University, the biggest one they could get, and it took two weeks to do 3,000 iterations or something like that. It seemed like a lot of smoke and mirrors. Then Greg came out to the ART/Forum and started talking about hierarchical Bayes. It just seemed strange to so many of us who came from a different background.

I remember at one point Rich Johnson came to the realization that HB was the real deal, and Rich said to me, “I’ve got to learn how to do this. I’ve just got to sit down and persevere until I figure it out.” But Rich didn’t have to figure it out by himself, because he attended Greg’s tutorial on HB here at the ART/Forum, where Greg generously gave Rich his code for hierarchical Bayesian estimation. A day and a half later, Rich called me on the phone and said, “I’ve got it. It’s working!” I said, “You figured it out?” And he said, “Well, I had some help.” Sawtooth Software is extremely indebted to Greg, because it certainly didn’t hurt our bottom line over the years, and it helped us an awful lot as an industry in being able to disseminate these methods. Over the years other software has come along to further popularize it and make it accessible to everybody. So it’s been a great thing. We owe Greg a great deal for that.

Another Parlin Award winner, from 2001, is John Hauser, who came out to the Sawtooth Software conference and gave a presentation on non-compensatory decision-making in conjoint analysis <referring to the Hauser et al. 2006 Sawtooth Software Conference paper>. It hit Rich and me at a time when we were banging our heads against the problem of coming up with an adaptive form of CBC. We were focusing on a method of creating adaptive designs in CBC that would, on the fly, for each respondent, look at their previous answers and then try to maximize D-efficiency in designing new choice tasks. Part of the D-efficiency equation involves the respondent’s betas. It ended up dramatically increasing D-efficiency at the individual level, as well as utility balance. But it couldn’t improve the predictions or the validity of the data beyond standard CBC. That insight that John Hauser shared gave Rich and me a lightning bolt of inspiration, leading to a different and more successful version of adaptive CBC <see Johnson and Orme’s 2007 Sawtooth Software Conference paper>. So John Hauser was super helpful to us at that time.
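To make the utility-balance idea concrete, here is a toy sketch, with hypothetical betas and dummy-coded profiles throughout; this is not the actual algorithm from either paper. Given a respondent’s interim betas, score each candidate paired task by how close its logit choice probability is to 50/50 and serve the most balanced one:

```python
import math

def choice_prob(beta, profile_a, profile_b):
    """Logit probability that profile_a is chosen over profile_b."""
    ua = sum(b * x for b, x in zip(beta, profile_a))
    ub = sum(b * x for b, x in zip(beta, profile_b))
    return math.exp(ua) / (math.exp(ua) + math.exp(ub))

def most_balanced_task(beta, candidate_pairs):
    """Pick the candidate task whose choice probabilities are closest to 50/50."""
    return min(candidate_pairs,
               key=lambda pair: abs(choice_prob(beta, *pair) - 0.5))

# Hypothetical interim betas and dummy-coded candidate profiles:
beta = [0.8, -0.3, 1.1]
candidates = [
    ([1, 0, 0], [0, 1, 0]),
    ([1, 0, 1], [0, 0, 1]),
    ([0, 1, 1], [1, 0, 0]),  # utilities 0.8 vs 0.8: perfectly balanced
]
task = most_balanced_task(beta, candidates)
```

As the story above notes, chasing balance this way raised D-efficiency and balance but not predictive validity, which is exactly why the approach was abandoned.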

Next I’d like to talk about Steve Cohen, who won this award in 2011. I was investigating the method of paired comparisons and doing some preliminary research on how it could beat the 5-point or 10-point Likert scale for measuring importance or preference. I was on the phone with Steve, and I said, “Hey Steve, what do you think about the method of paired comparisons?” And Steve said to me, “I’ve got something that will knock the socks off the method of paired comparisons!” I said, “What’s that?” and he said, “It’s Maximum Difference Scaling…best-worst scaling. It’s a Jordan Louviere invention.” And I said, “Steve, why don’t we do some joint research on it, and why don’t you present it at the Sawtooth Software conference.” We did so, and sure enough we found that MaxDiff worked better <see Orme and Cohen’s 2003 Sawtooth Software Conference papers>—I don’t know if it knocked the socks off it—but it certainly worked better than the method of paired comparisons, and a heck of a lot better than those nasty 5-point and 10-point rating scales. He and I then joined forces to publish a paper in Marketing Research magazine in 2004, for which we were given the Hardin award. So Steve assisted me greatly in that particular effort.

Jordan Louviere, Parlin Award winner in 2010…he’s been very generous to me personally and very generous to Sawtooth Software. I don’t think he ever viewed us as rivals or worried about competition from us. When questions came up—and others in this room can say this too—when we wrote Jordan via email to ask a question, he would often reply with his articles and even unpublished white papers. In the early days they came by fax; later it was email. Many of us remember how gutsy he would be and how much fire and passion he would bring, both here at ART/Forum and at the Sawtooth Software conferences. He’s been a great influence on me as well, for which I’m grateful.

Paul Green, 1977 Parlin Award winner…just a beautiful person. I remember him playing piano for us at the Sawtooth Software conference. Super talented. He would sometimes send me copies of his latest papers with handwritten messages…just giving me encouragement when I was young in my career. He also wrote the foreword of my Getting Started with Conjoint Analysis book. I’m super grateful to him.

Other people who haven’t yet won the Parlin award walk around these halls and have also been helpful to me, like David Lyon. He’s here <pointing to David at the back of the room>. I love the tagline in his email signature, which says, “Statistics means never having to say you’re certain,” which of course relates to the probability statements I discussed earlier. I remember being delayed at the Chicago airport coming out of ART/Forum 2013. We had seen a presentation by Eric Schwartz on Thompson sampling as a solution to the multi-armed bandit problem. I sat down with David and said, “Hey, I think we can apply this to MaxDiff. But I don’t quite understand how it works.” And David did a great job leading me through the pieces and telling me how it’s done in a way that was crystal clear.
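For the curious, the core of the Thompson sampling idea can be sketched in a few lines. This is a simplified illustration with made-up item names and posteriors, not a production bandit-MaxDiff implementation: draw one utility per item from its current posterior, then build the next MaxDiff set from the items whose sampled utilities come out on top.

```python
import random

def thompson_pick(posteriors, set_size):
    """Thompson sampling step: draw one utility per item from its posterior,
    then return the items with the highest sampled utilities."""
    sampled = {item: random.gauss(mu, sd) for item, (mu, sd) in posteriors.items()}
    return sorted(sampled, key=sampled.get, reverse=True)[:set_size]

# Made-up per-item posteriors (mean, sd) partway through fielding:
random.seed(11)
posteriors = {
    "item_a": (2.0, 0.3),
    "item_b": (0.4, 0.8),
    "item_c": (1.8, 0.3),
    "item_d": (-1.0, 0.5),
    "item_e": (0.2, 0.9),
}
next_set = thompson_pick(posteriors, set_size=4)
```

After each answer the posteriors get updated, so items that keep winning are shown more often while clear losers drop out of rotation, which is what makes the approach efficient for finding the top items.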

Tom Eagle is another person who unfortunately was not able to make it here today. When I would come to ART/Forum I used to make it a point to take Tom out to dinner, because I could just pepper him with questions. For example, he taught me about piecewise regression coding. I would often have to say, “I’m sorry, Tom, I don’t understand. Can you say it a different way?” And he helped me out dramatically. Tom has never been a “yes man”; he is always challenging things. He’ll hear something, he’ll challenge it, and he’ll do the research to back it up.

Joel Huber also unfortunately wasn’t able to be here. He has been a super firm supporter of ART/Forum and of the Sawtooth Software conferences, and he wrote early papers on conjoint analysis. Not many of you may remember, but back in the 80s people broke off into different camps: the discrete choice camp, the full profile camp, and the ACA camp. People could become firmly entrenched in their camps, pointing fingers and accusing the other camps of being wrong, or disingenuous. Joel was really good at taking it all in, doing the research, and writing a paper showing the strengths and weaknesses of the different approaches in a very honest and legitimate way, so we could all open our eyes and embrace the best aspects of what each other was doing.

If you ever hear somebody stand up and say, “It’s my way or no way. I’ve got this right. Everybody else is wrong. There’s only one legitimate way to do it that’s going to work,” you should really be skeptical. Different disciplines bring different techniques and methods to these problems. The diversity sitting in this room, for example, is huge. You’ve got some people coming out of the Bayesian camp, some out of economics, some out of statistics, some from machine learning, and they’ve got different ways to crack these nuts. You shouldn’t be thinking, “My way is the only right way,” because there are multiple ways that can do a very good job. And if you can’t agree, you can just ensemble it all, right?

I remember recently recruiting at a university, talking about conjoint analysis, and trying to sell the students on the idea of a career at Sawtooth Software. A hand shot up and an idealistic young student asked, “What good does Sawtooth Software do in the world?” And I said, “I’ll give you three examples that I think will interest you regarding what good our company does in the world.”

Number one, my wife and I went out to buy a minivan some years ago. We looked at a number of different minivans. About three weeks later I was on the phone with Kyle Sziraki from Honda R&D, and I said, “Hey, my wife and I went out minivan shopping and we looked at the Honda Odyssey.” He said, “What did you think?” And I said, “My wife absolutely went bonkers for the dual sliding doors controlled by the key fob.” Remember when that first came out it was really cool…this is many years back. And Kyle said to me, “Bryan, do you know why we put the dual sliding doors on the Honda Odyssey? It was due to a conjoint analysis study that we did.” I thought that was remarkable, and I thought to myself, “How many other products do I buy in my life that have been enhanced and made better through conjoint analysis?”

Then I gave two other examples. I related to this student that there was a PhD student named Stephanie Hime who asked for a grant for our software, which we approved. She later sent me a copy of her thesis on a DVD. Her thesis—and you might snicker when I say this—was that she was going to go down to the British Virgin Islands to study the impact that divers and snorkelers were having on the coral reefs, and how to balance the benefits of tourism with the effect on the reefs. She had the background for it: her undergrad work was in oceanography and biology. As a component of her research, they were actually—with permission—damaging little pieces of coral, typically just a few inches by a few inches, simulating kicks and other damage that can happen when people are inattentive while diving and snorkeling around the coral.

They’d come back weeks later to see how the coral was recovering, or not. They also included a CBC study, interviewing tourists—divers and snorkelers—talking to them about the conditions of the reef and showing them sample pictures of what the reef could look like under different levels of health: picture A, or B, or C. You can have this many fish, or that many fish. Here are all the prices and attributes of travel packages. They used the results of all this to advise the British government on how to manage the resources while also supporting the tourism component.

The third example that I found very satisfying while working at Sawtooth Software involved a professor named Margaret Kruk, who has a team down in Tanzania consulting with the government. They are trying to improve and build health care clinics that deliver better health care to women, who are often out in the bush, for prenatal and postnatal care, to improve both their lives and the lives of their unborn and later newborn children. They are using CBC, but not online or on laptops; there is no internet where they are conducting the interviews with these women. They are doing paper-and-pencil CBC surveys to figure out the drivers…how to advise the government on improving the features of these healthcare clinics to encourage these women to use them. The government was providing these facilities, but the women were often not going to them; they just didn’t see the value.

So, it’s fascinating…sometimes we’re just trying to build a better mousetrap or to increase revenues, like we’ve seen in some presentations today. But, believe me—and many of you know this because you’ve been involved in such projects—these research techniques help the world in so many ways.

Others who have blessed me in my career development include Keith Chrzan, who was my boss at one time back at IntelliQuest and whom I’ve had the good fortune of working with for the last four years, as well as Karlan Witt and Chris King.

In terms of thoughts and recommendations going forward, some of our challenges for the future include becoming better communicators about what these techniques do. For me, that means being a better communicator about what conjoint analysis does: knowing what and what not to present to clients, and leveraging market simulators and shares of preference more than part-worths and importances.

Regarding conjoint analysis, a number of problems were alluded to in today’s sessions; for example, hypothetical bias and the craft of actually preparing, selecting, and educating respondents so they can give us good data. The industry lately has seemed to reward speed and cost savings above quality and care. Given the right tools, you can execute conjoint analysis in a matter of hours now: design and field it in a matter of hours, analyze it in another couple of hours, and the next day be making a decision, whereas before it used to take weeks or months. The panel sample business certainly facilitates that. But with all of this speed and cost savings and all this innovation in technology, I think we are often losing the good old-fashioned touch of knowing how to do the research right and having the courage to take the time to do it right. There is a lot of craft that needs to go into preparing the questionnaire: pre-testing, sitting down with respondents and figuring out whether they understood your questionnaire and whether they are having trouble in any spots. These steps that we used to take are almost becoming a lost art at many firms, and they should not be.

It comes back to good old-fashioned quality: plain honesty, openness, willingness to share, willingness to admit that you don’t know all the answers. These are virtues that the people who have won the Parlin award before me have possessed. I am so grateful to receive this today.

I’m grateful to the Parlin committee that considered the work that I’ve done over the last twenty-some years. Although this is supposed to be a lifetime achievement award, you look at me and I’m not very old yet. I’ve got a lot more to do and so I’m just grateful that the work we’ve done to this point is recognized. I’m so thankful to receive this and for you to be here to share this moment with me. Thank you.