CaltechTHESIS
  A Caltech Library Service

Three Essays on Survey Methods and their Applications to Measuring Political Behavior and Attitudes

Citation

Li, Yimeng (2022) Three Essays on Survey Methods and their Applications to Measuring Political Behavior and Attitudes. Dissertation (Ph.D.), California Institute of Technology. doi:10.7907/32kq-yy36. https://resolver.caltech.edu/CaltechTHESIS:06062022-181748014

Abstract

In this thesis, I develop survey methods and apply them to measure political behavior and attitudes more accurately. Two challenges researchers face in measuring political behavior and attitudes are respondents' reluctance to answer sensitive questions truthfully and respondents' inattention when providing responses. I contribute to advancing survey methodology to tackle both challenges.

Eliciting truthful answers from respondents on sensitive issues is a difficult problem in surveys, and list experiments have emerged as the most popular indirect questioning technique among political scientists and sociologists. The analysis of list experiments depends on two assumptions, known as "no design effect" and "no liars." The no liars assumption is strong and may fail in many list experiments. In Chapter II (published in Political Analysis), I relax the no liars assumption and develop a method that bounds the prevalence of sensitive behaviors or attitudes under a weaker behavioral assumption. I apply the method to a list experiment on the anti-immigration attitudes of California residents and to a broad set of existing list experiment datasets. My results indicate that the bounds tend to be narrower when the list consists of items of the same category, such as multiple groups or organizations, different corporate activities, or various considerations for politician decision-making. The contribution of my paper is to illustrate when the full power of the no liars assumption is most needed to pin down the prevalence of the sensitive behavior or attitude, and to facilitate analysis of list experiments that is robust to violations of the no liars assumption.
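To fix ideas, the baseline estimator that the no design effect and no liars assumptions together justify is a simple difference in mean item counts between treatment and control groups. The sketch below uses simulated data for illustration only; it shows the standard estimator, not the bounds method developed in Chapter II.

```python
import numpy as np

# Simulated list experiment: the control group counts J = 3 non-sensitive
# items; the treatment group counts the same items plus the sensitive one.
rng = np.random.default_rng(0)

n = 1000
true_prevalence = 0.30  # share of respondents holding the sensitive attitude
control_counts = rng.binomial(3, 0.5, size=n)
holds_sensitive = rng.random(n) < true_prevalence
treatment_counts = rng.binomial(3, 0.5, size=n) + holds_sensitive

# Under "no design effect" and "no liars", the difference in mean counts
# identifies the prevalence of the sensitive item.
estimate = treatment_counts.mean() - control_counts.mean()
print(round(estimate, 3))
```

When respondents lie (e.g., a respondent who holds the sensitive attitude deflates their count), this difference in means is biased, which is what motivates replacing the point estimate with bounds under a weaker assumption.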

Over the past two decades, the environment in which respondents participate in surveys and polls has changed, with shifts from interviewer-driven to respondent-driven surveying and from probability to nonprobability sampling. One consequence of these technological changes is that survey respondents in these environments may be less attentive to survey questions. In Chapter III (published in Political Analysis), co-authored with R. Michael Alvarez, Lonna Atkeson, and Ines Levin, we study respondent attention and its implications using data from a self-completion online survey that identified inattentive respondents using instructed-response items (IRIs), a simple attention check that has received little scholarly attention. Our results demonstrate that ignoring attentiveness provides a biased portrait of the distribution of critical political attitudes and behavior, both sensitive and more prosaic in nature, and results in violations of key assumptions underlying experimental designs. We discuss four approaches to dealing with inattentiveness in surveys and when each approach is appropriate.

Attention checks, in the form of instructional manipulation checks (IMCs) or instructed-response items (IRIs), are useful tools for survey quality control. However, due to the lack of ground truth information, previous studies have had to rely on various post hoc measures to evaluate the performance of attention filters. For the same reason, it has also been impossible to evaluate the performance of different statistical approaches to dealing with inattentive respondents. In Chapter IV, co-authored with R. Michael Alvarez, we conduct a first validation study, analyzing a large-scale post-election survey following the November 2018 General Election and validating survey responses at the individual level against administrative records. Our results show that for each type of attention check, respondents failing the check provided responses with lower accuracy than respondents passing it. We compare the performance of different approaches to dealing with inattentive respondents in the study of turnout and voting method, two variables of substantive interest that are available from the administrative records, and conclude that the best strategy depends on a bias-variance trade-off that also accounts for the correlation between respondent attention and the outcome variables of interest.
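The bias-variance trade-off at issue can be seen in a toy simulation: dropping respondents who fail an attention check removes noisy answers (reducing bias) but shrinks the sample (increasing variance). All numbers below are invented for illustration and do not come from the validation study.

```python
import numpy as np

# Hypothetical survey of self-reported turnout with a known true rate.
rng = np.random.default_rng(1)

true_turnout = 0.60
n, share_inattentive = 2000, 0.25
attentive = rng.random(n) >= share_inattentive
voted = rng.random(n) < true_turnout

# Assume attentive respondents report truthfully, while inattentive
# respondents effectively answer at random.
reported = np.where(attentive, voted, rng.random(n) < 0.5)

full_sample_est = reported.mean()                # biased toward 0.5
attentive_only_est = reported[attentive].mean()  # unbiased, but smaller n

print(round(full_sample_est, 3), round(attentive_only_est, 3))
```

In this setup the full-sample estimate is pulled toward 0.5 in proportion to the inattentive share, while the attentive-only estimate trades that bias for a larger sampling variance; which error dominates depends on the inattentive share, the sample size, and how attention correlates with the outcome.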

Item Type:Thesis (Dissertation (Ph.D.))
Subject Keywords:political methodology, survey methods, political behavior
Degree Grantor:California Institute of Technology
Division:Humanities and Social Sciences
Major Option:Social Science
Thesis Availability:Public (worldwide access)
Research Advisor(s):
  • Alvarez, R. Michael
Thesis Committee:
  • Katz, Jonathan N. (chair)
  • Alvarez, R. Michael
  • Hirsch, Alexander V.
Defense Date:26 April 2022
Record Number:CaltechTHESIS:06062022-181748014
Persistent URL:https://resolver.caltech.edu/CaltechTHESIS:06062022-181748014
DOI:10.7907/32kq-yy36
Related URLs:
  • https://www.cambridge.org/core/journals/political-analysis/article/relaxing-the-no-liars-assumption-in-list-experiment-analyses/C0296899265E94123B30C5CBDF65B51B (Publisher): Article adapted for ch. 2
  • https://www.cambridge.org/core/journals/political-analysis/article/paying-attention-to-inattentive-survey-respondents/BEDA4CF3245489645859E7E6B022E75A (Publisher): Article adapted for ch. 3
  • https://preprints.apsanet.org/engage/apsa/article-details/61a901af704d057d023da5cf (Organization): Article adapted for ch. 4
ORCID:
  • Li, Yimeng: 0000-0003-3855-0756
Default Usage Policy:No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code:14950
Collection:CaltechTHESIS
Deposited By: Yimeng Li
Deposited On:07 Jun 2022 15:25
Last Modified:04 Aug 2022 21:55

Thesis Files

PDF (PDF file of thesis) - Final Version, 2MB. See Usage Policy.
