Summary and Conclusions
Learning directly from professional survey interviewers about how they do their jobs is a critical step in understanding how to develop realistic data collection models, improve question wording, and improve interviewer training. Most interviewers handled asking sensitive and difficult questions in similar ways, using techniques such as distancing, apologizing, and repeating the question. Despite being trained to read questions verbatim and use scripted probes (when provided) or neutral probes, interviewers reported using a variety of unscripted and non-neutral probes and other techniques when faced with sensitive or difficult questions and unexpected situations. Interviewers articulated the importance of being flexible and adapting to respondents in real-time, while maintaining rapport to help complete the interview, despite not receiving much formal training on these skills. The data from the interviews and vignettes illustrated the interview as a social interaction between the interviewers and respondents, which drives the need to maintain rapport over and above following the scripted questionnaire.
The demands of interviewing respondents (e.g., the ability to be flexible, conversational, and maintain rapport with respondents without compromising data quality) may have led to some of the inconsistencies in techniques used to handle sensitive and difficult questions and contexts that were observed across interviewers. Some interviewers noted that they make it apparent at the beginning of an interview that respondents can skip any questions they do not want to answer; some used distancing techniques; and others used non-scripted lead-ins or selective reading of response options. Most interviewers reported that they probe when respondents have comprehension or recall problems, concerns about confidentiality, or uneasiness with a particular topic. Interviewers decided not to probe when respondents had limited time, were reluctant to participate, or the question topic was highly sensitive.
Interviewers reported using a wide range of probing strategies. Some modified the question or tailored their lead-ins with information respondents previously provided. Others changed scripted probes, read a subset of response options, changed open-ended questions to closed-ended questions, and shortened questions, while a minority did little probing at all. In response to vignettes depicting realistic survey scenarios, interviewers often reported they used a conversational approach, unscripted probes, tailored question lead-ins, displays of empathy, or distanced themselves from the survey as needed to maintain rapport and complete the interview. Although these changes to the interview script are well-intentioned, they may affect measurement error and survey estimates. The vignettes were a useful tool to understand how interviewers react to unexpected survey contexts, including sensitive or difficult interactions, and shed light on interviewers' decision-making processes when facing these challenges.
Recommendations for Interviewer Training
This research identified several ways that survey organizations could improve support and training for interviewers. Based on these findings, it is recommended that survey organizations focus more training on topics such as probing sensitive or difficult questions. Training materials should help interviewers understand why a question is included in the survey and how to effectively explain complex concepts to respondents. Many interviewers expressed frustration not only with getting respondents to participate in the first place, but also with having to administer questions that they themselves find uncomfortable or lack confidence in asking. It is also recommended that survey organizations include interviewers throughout the questionnaire development process, including in final questionnaire decisions. In the absence of understanding the value and purpose of survey questions, interviewers may alter the meaning of the original question or probe in non-standardized ways, contributing to measurement error. Training on the purpose of each survey question and on how to effectively handle sensitive or difficult survey questions is likely to improve how interviewers administer these questions.
Recommendations for Future Research
Conducting research directly with interviewers is critical to improving our understanding of the complex nature of interviewer-respondent interactions. Systematic research on interviewers' decision-making, cognitive processes, and probing in sensitive and difficult survey contexts in particular is needed in the survey methods literature. New technologies such as computer audio-recorded interviewing (CARI) make this type of research possible and enable organizations to see how interviewers actually administer questions in the field, which can help organizations identify and improve problematic questions.
Many of the themes identified in this research related to inconsistent probing across interviewers. This is problematic, as interviewer effects are often tied to the rate at which respondents provide inadequate answers that require probing (Mangione, Fowler, and Louis 1992; West and Blom 2017). Future research should investigate when and why interviewers deviate from the script so that survey organizations can provide interviewers with the tools to respond to these concerns consistently and in a manner that can be measured and monitored. For instance, conducting research on the effectiveness of building scripted probes directly into survey instruments could prove beneficial. The use of scripted probes has the potential to help better standardize survey interviews, increase data quality, and reduce measurement error. It is also recommended that researchers make use of vignettes for evaluating questions that may be problematic. Question pretesting is often focused solely on potential problems with respondent processing, but question pretesting with interviewers using vignettes could improve survey design, including question wording and probes. Investments in interviewer training and research conducted directly with survey interviewers are likely to have a positive impact on survey administration in the future, and help to reduce the total survey error associated with sensitive and difficult survey questions.
References
Conrad, F. G., and M. F. Schober. 2000. Clarifying question meaning in a household telephone survey. Public opinion quarterly 64(1):1-28.
Dykema, J., and N. C. Schaeffer. 2005. An investigation of the impact of departures from standardized interviewing on response errors in self-reports about child support and other family-related variables. Paper Presented at the Annual Meeting of the American Association for Public Opinion Research, Miami, FL.
Fowler, F.J., and T. W. Mangione. 1990. Standardized survey interviewing: Minimizing interviewer-related error. Newbury Park, CA: Sage Publications.
Glaser, B. G., and A. L. Strauss. 2017. Discovery of grounded theory: Strategies for qualitative research. New York, NY: Routledge, CRC Press/Taylor & Francis.
Groves, R. M., and M. P. Couper. 1998. Nonresponse in household interview surveys. New York, NY: John Wiley & Sons.
Haan, M., Y. Ongena, and M. Huiskes. 2013. Interviewers' questions: Rewording not always a bad thing. In Interviewers' deviations in surveys: Impact, reasons, detection and prevention, ed. P. Winker, N. Menold, and R. Porst, 173-193. Frankfurt on the Main: Peter Lang Academic Research.
Houtkoop-Steenstra, H. 2000. Interaction and the standardized survey interview: The living questionnaire. Cambridge, UK: Cambridge University Press.
Japec, L. 2008. Interviewer error and interviewer burden. In Advances in telephone survey methodology, ed. E. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, and R. L. Sangster, 185-211. Hoboken, NJ: John Wiley & Sons.
Mangione, T. W., F. J. Fowler, and T. A. Louis. 1992. Question characteristics and interviewer effects. Journal of official statistics 8(3):293-307.
Näher, A. F., and I. Krumpal. 2012. Asking sensitive questions: The impact of forgiving wording and question context on social desirability bias. Quality & quantity 46(5):1601-1616.
Olson, K., J. D. Smyth, and B. Cochran. 2018. Item location, the interviewer-respondent interaction, and responses to battery questions in telephone surveys. Sociological methodology 48(1):225-268.
Olson, K., J. D. Smyth, and A. Ganshert. 2019. The effects of respondent and question characteristics on respondent answering behaviors in telephone interviews. Journal of survey statistics and methodology 7(2):275-308.
Ongena, Y. P., and W. Dijkstra. 2006. Methods of behavior coding of survey interviews. Journal of official statistics 22:419-451.
Ongena, Y. P., and W. Dijkstra. 2007. A model of cognitive processes and conversational principles in survey interview interaction. Applied cognitive psychology: The official journal of the society for applied research in memory and cognition 21(2):145-163.
Peter, J., and P. M. Valkenburg. 2011. The impact of "forgiving" introductions on the reporting of sensitive behavior in surveys: The role of social desirability response style and developmental status. Public opinion quarterly 75(4):779-787.
Sander, J. E., F. G. Conrad, P. A. Mullin, and D. J. Herrmann. 1992. Cognitive modelling of the survey interview. In Proceedings of the Joint Statistical Meetings, Survey Research Methods Section, 818-823.
Schaeffer, N. C., J. Dykema, D. Garbarski, and D. W. Maynard. 2008. Verbal and paralinguistic behaviors in cognitive assessments in a survey interview. In Proceedings of the Joint Statistical Meetings, Survey Research Methods Section, 4344-4351.
Schober, M. F., and F. G. Conrad. 1997. Does conversational interviewing reduce survey measurement error? Public opinion quarterly 61:576-602.
Tourangeau, R., L. J. Rips, and K. Rasinski. 2000. The psychology of survey response. Cambridge, UK: Cambridge University Press.
Tourangeau, R., G. Shapiro, A. Kearney, and L. Ernst. 1997. Who lives here? Survey undercoverage and household roster questions. Journal of official statistics 13(1):1-18.
West, B. T., and A. G. Blom. 2017. Explaining interviewer effects: A research synthesis. Journal of survey statistics and methodology 5(2):175-211.
West, B. T., F. G. Conrad, F. Kreuter, and F. Mittereder. 2018. Can conversational interviewing improve survey response quality without increasing interviewer effects? Journal of the royal statistical society: series A (statistics in society) 181(1):181-203.