I considered using Sparse MaxDiff (showing each item only once) but decided against it because it would result in more MaxDiff tasks per respondent than I'm comfortable with, given concerns about respondent fatigue.

Instead, I'm considering Express MaxDiff (where each respondent completes the MaxDiff exercise with a randomly selected subset of 30 of the 200 total items), but I'm not sure of the steps to design and analyze it in Lighthouse Studio, given that the survey will be programmed and hosted by an external vendor. What would be the procedure? For example:

(1) In Lighthouse Studio, create a MaxDiff design for 30 items per the usual process with, say, 100 versions, and export the design to a .csv file.

(2) Send this .csv design to the external programming vendor along with the full 200-item list, and instruct the vendor to randomly select 30 items from the 200 for each respondent to serve as that respondent's 30 MaxDiff items.

(3) I'm not sure what to do in terms of:

(a) what MaxDiff data to request from the vendor;

(b) how to read that data into Lighthouse Studio via the paper-and-pencil Accumulated Data Template file for HB analysis; and

(c) how to run HB in Lighthouse Studio so that it imputes scores for items not asked of each respondent, leaving me with respondent-level MaxDiff scores for all 200 items.
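For what it's worth, here is roughly how I picture the random-selection step in (2), and the per-respondent mapping I'd presumably need back from the vendor for (a) so responses can be relabeled to master item numbers. This is just a hedged sketch with hypothetical names (assign_items, item_mapping.csv), assuming the vendor records which master item fills each design position, not a definitive spec:

```python
import csv
import random

TOTAL_ITEMS = 200  # size of the master item list
SUBSET_SIZE = 30   # items shown to each respondent

def assign_items(respondent_ids, seed=42):
    """Randomly pick 30 of the 200 master items for each respondent.

    Returns {respondent_id: [master_item_id, ...]}, where list position
    k-1 corresponds to design item k, i.e. design item k for that
    respondent maps to master item mapping[rid][k - 1].
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    return {
        rid: sorted(rng.sample(range(1, TOTAL_ITEMS + 1), SUBSET_SIZE))
        for rid in respondent_ids
    }

if __name__ == "__main__":
    # Write the per-respondent mapping file the vendor would return,
    # one row per respondent: respondent_id, then the master item id
    # occupying each of the 30 design positions.
    mapping = assign_items([1001, 1002, 1003])
    with open("item_mapping.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(
            ["respondent_id"]
            + [f"design_item_{k}" for k in range(1, SUBSET_SIZE + 1)]
        )
        for rid, items in mapping.items():
            w.writerow([rid] + items)
```

With a mapping file like this plus each respondent's best/worst choices (recorded as design item positions), the choices could be translated back to master item numbers for analysis, if I understand the Express MaxDiff approach correctly.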

Thank you.