Online survey-based methods are increasingly used to elicit preferences for healthcare. This digitization creates an opportunity for interactive survey elements, potentially improving respondents’ understanding, engagement, or both.
Our objective was to understand whether, and how, training materials in a survey influenced stated preferences.
An online discrete-choice experiment (DCE) was designed to elicit public preferences for a new targeted approach to prescribing biologics (“biologic calculator”) for rheumatoid arthritis (RA) compared with conventional prescribing. The DCE presented three alternatives: two biologic calculators and a conventional approach (opt-out), described by five attributes: delay to treatment, positive predictive value, negative predictive value, infection risk, and cost saving to the National Health Service. Respondents were randomized to receive training materials as plain text or an animated storyline. The training materials contained information about RA and approaches to its treatment and described the biologic calculator. Background questions covered sociodemographics and self-reported measures of task difficulty and attribute non-attendance. DCE data were analyzed using conditional logit and heteroskedastic conditional logit (HCL) models.
In total, 300 respondents completed the DCE, receiving either plain text (n = 158) or the animated storyline (n = 142). The HCL showed that the estimated coefficients for all attributes aligned with a priori expectations and were statistically significant. The scale term was statistically significant, indicating that respondents who received plain-text materials made less consistent (more random) choices. Further tests suggested preference homogeneity between the two groups after accounting for differences in scale.
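The role of the scale term can be made concrete with the standard heteroskedastic conditional logit formulation; the exact parameterization used in the study is not stated in this section, so the following is a sketch of the conventional specification. The probability that respondent $n$ in training-materials group $g$ chooses alternative $j$ from choice set $C$ is

```latex
% Standard heteroskedastic conditional logit (sketch; group indexing is an
% assumption): \lambda_g scales utility for group g, with one group's scale
% normalized to 1 for identification.
P_{nj} \;=\; \frac{\exp\!\left(\lambda_g \, \beta' x_{nj}\right)}
                  {\sum_{k \in C} \exp\!\left(\lambda_g \, \beta' x_{nk}\right)},
\qquad \lambda_{g_0} = 1 .
```

Because the error variance is inversely related to $\lambda_g$, a larger estimated scale for the animated-storyline group corresponds to smaller error variance, i.e., more consistent choices, while the attribute coefficients $\beta$ (preferences) can remain equal across groups, which is what the homogeneity tests assess.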
Animated training materials did not change respondents’ preferences but appeared to improve choice consistency, potentially allowing researchers to use more complex designs with larger numbers of attributes, levels, alternatives, or choice sets.