The Impact of Pictures on Best-Worst Scaling in Web Surveys

Open access

Abstract:

Motivation and burden are two of the most important factors influencing response rates and dropout in online surveys. We therefore focus our analyses on how pictures and Best-Worst Scaling (BWS), one proposed solution for each of these problems, interact in the Web medium. We use an experimental design that compares a BWS task presented with pictures (the experimental group) to the same BWS task presented without pictures (the control group). Results show that pictures influence the measurement of BWS for six out of 16 items. We also observe that Couper's (2001) conclusion, that concordant text and images have an accentuation effect while discordant text and images have an interference effect, holds only partly in our data: eight of the 16 items are at least partially influenced by the concordant/discordant variable, and four fully conform to this model. We conclude by discussing the implications of our findings and their limitations.
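For readers unfamiliar with how BWS responses are typically scored, the sketch below illustrates the common best-minus-worst counting approach and a simple item-level comparison between two groups. The data, item names, and the use of a t-test are illustrative assumptions only; they do not reproduce the authors' actual items or analysis.

```python
# Minimal sketch of best-minus-worst (B-W) scoring for BWS data and a
# between-group comparison. All data and item names are hypothetical.
from collections import Counter
from scipy import stats

def bw_scores(choice_sets):
    """Return each item's count of 'best' picks minus 'worst' picks.

    choice_sets: iterable of (best_item, worst_item) pairs, one per
    BWS task completed by a single respondent.
    """
    best = Counter(b for b, _ in choice_sets)
    worst = Counter(w for _, w in choice_sets)
    items = set(best) | set(worst)
    return {item: best[item] - worst[item] for item in items}

# Hypothetical respondents: the experimental group saw pictures,
# the control group did not.
experimental = [[("fun", "security"), ("respect", "fun")],
                [("fun", "belonging"), ("fun", "security")]]
control = [[("security", "fun"), ("respect", "fun")],
           [("belonging", "fun"), ("security", "respect")]]

# Per-respondent B-W score for one item, e.g. "fun".
exp_fun = [bw_scores(r).get("fun", 0) for r in experimental]
ctl_fun = [bw_scores(r).get("fun", 0) for r in control]

# One simple way to test whether pictures shift an item's measurement.
t, p = stats.ttest_ind(exp_fun, ctl_fun)
print(f"'fun': experimental={exp_fun}, control={ctl_fun}, p={p:.3f}")
```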


  • Cook, C., Heath, F., & Thompson, R. L. (2000). A Meta-Analysis of Response Rates in Web- or Internet-Based Surveys. Educational and Psychological Measurement, 60(6), 821–836. doi:10.1177/00131640021970934

  • Couper, M. P. (2000). Web surveys: A review of issues and approaches. The Public Opinion Quarterly, 64(4), 464–494.

  • Couper, M. P. (2001). Web surveys: The questionnaire design challenge. Proceedings of the 53rd session of the ISI. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.99.3436&rep=rep1&type=pdf

  • Couper, M. P., Conrad, F. G., & Tourangeau, R. (2007). Visual Context Effects in Web Surveys. Public Opinion Quarterly, 71(4), 623–634. doi:10.1093/poq/nfm044

  • Couper, M. P., Tourangeau, R., & Kenyon, K. (2004). Picture This! Exploring Visual Effects in Web Surveys. Public Opinion Quarterly, 68(2), 255–266. doi:10.1093/poq/nfh013

  • Crawford, S. D., Couper, M. P., & Lamias, M. J. (2001). Web Surveys: Perceptions of Burden. Social Science Computer Review, 19(2), 146–162. doi:10.1177/089443930101900202

  • Dillman, D. A., & Smyth, J. D. (2007). Design Effects in the Transition to Web-Based Surveys. American Journal of Preventive Medicine, 32(5), S90–S96. doi:10.1016/j.amepre.2007.03.008

  • Dillman, D. A., Tortora, R. D., Conradt, J., & Bowker, D. (1998). Influence of Plain vs. Fancy Design on Response Rates for Web Surveys. Proceedings of the Survey Research Methods Section, American Statistical Association, 1998.

  • Fan, W., & Yan, Z. (2010). Factors Affecting Response Rates of the Web Survey: A Systematic Review. Computers in Human Behavior, 26(2), 132–139. doi:10.1016/j.chb.2009.10.015

  • Finn, A., & Louviere, J. J. (1992). Determining the Appropriate Response to Evidence of Public Concern: The Case of Food Safety. Journal of Public Policy & Marketing, 11(2), 12–25.

  • Galesic, M. (2006). Dropouts on the Web: Effects of Interest and Burden Experienced During an Online Survey. Journal of Official Statistics, 22(2), 313.

  • Göritz, A. S. (2006). Incentives in Web studies: Methodological issues and a review. International Journal of Internet Science, 1(1), 58–70.

  • Kahle, L. R., Beatty, S. E., & Homer, P. (1986). Alternative measurement approaches to consumer values: The list of values (LOV) and values and life style (VALS). Journal of Consumer Research, 13(3), 405–409.

  • Kivu, M. (2010). Long Questionnaires: Impact on Abandon Rate (pp. 1–5). Bucharest: IPSOS Romania.

  • Lee, J. A., Soutar, G., & Louviere, J. (2008). The Best–Worst Scaling Approach: An Alternative to Schwartz's Values Survey. Journal of Personality Assessment, 90(4), 335–347. doi:10.1080/00223890802107925

  • Louviere, J. J. (1988). Conjoint Analysis Modelling of Stated Preferences: A Review of Theory, Methods, Recent Developments and External Validity. Journal of Transport Economics and Policy, 93–119.

  • Mahon-Haft, T. A., & Dillman, D. A. (2010). Does Visual Appeal Matter? Effects of Web Survey Aesthetics on Survey Quality. Survey Research Methods, 4, 43–59.

  • Manfreda, K., & Vehovar, V. (2002). Survey design features influencing response rates in web surveys. The International Conference on Improving Surveys Proceedings. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.87.515&rep=rep1&type=pdf

  • Marley, A. A. J., & Louviere, J. J. (2005). Some Probabilistic Models of Best, Worst and Best–Worst Choices. Journal of Mathematical Psychology, 49(6), 464–480. doi:10.1016/j.jmp.2005.05.003

  • Moore, W. L., Gray-Lee, J., & Louviere, J. J. (1998). A Cross-Validity Comparison of Conjoint Analysis and Choice Models at Different Levels of Aggregation. Marketing Letters, 9(2), 195–207.

  • Orme, B. (2000). Hierarchical Bayes: Why All the Attention? Sawtooth Software Research Paper Series, 1–7.

  • Sproull, L., Subramani, M., Kiesler, S., Walker, J. H., & Waters, K. (1996). When the Interface is a Face. Human-Computer Interaction, 11(2), 97–124.

  • Tavares, S., Cardoso, M., & Dias, J. G. (2010). The Heterogeneous Best-Worst Choice Method in Market Research. International Journal of Market Research, 52(4), 533. doi:10.2501/S1470785309201430

  • The American Association for Public Opinion Research. (2011). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys (7th ed.).

  • Toepoel, V., & Couper, M. P. (2010). Can Verbal Instructions Counteract Visual Context Effects in Web Surveys? Public Opinion Quarterly, 75(1), 1–18. doi:10.1093/poq/nfq044

  • Winters, L. (1989). SRI Announces VALS 2. Marketing Research, 1(2), 67.
