Enumerator Experiences in Violent Research Environments

Understanding the political and social effects of violence on local populations through public opinion surveys has become increasingly common internationally. Yet while researchers are attuned to the challenges that can arise during survey implementation, this work focuses almost uniformly on respondents. This paper considers survey enumerators as critical actors in data collection in violent research settings. We present survey results from 245 enumerators in Côte d'Ivoire to show that their personal feelings of insecurity and exposure to violence while conducting surveys may condition the challenges they face and the compromises they make in gathering data. We shed light on how academic research in violent political settings poses unique security concerns for enumerators, with ramifications for data integrity.


Introduction
In the wake of a sharp uptick in academic surveys (Lupu & Michelitch, 2018) and field experiments (Baldassarri & Abascal, 2017) run in the Global South, social scientists have initiated a number of necessary conversations about our ethical obligations to our research participants (e.g., Fujii, 2012). These conversations have been particularly pronounced among scholars working in insecure settings, where concerns revolve around how best to approach sensitive topics, either with an eye towards avoiding re-traumatizing respondents (e.g., Moss et al., 2019) or out of recognition that such discussions can in fact be cathartic (Jaffe et al., 2015; Wood, 2006). All this research depends on locally recruited enumerators and research assistants. Yet despite recent nods to the need to think through the risks faced by field staff in violent contexts (Cronin-Furman & Lake, 2018; Kaplan et al., 2020), we know relatively little about the challenges that local research teams face while running academic surveys.
In this paper, we take up this task to ask: what are the experiences of survey enumerators working in insecure settings? Results from an original survey of 245 enumerators in Côte d'Ivoire reveal two core findings. First, we find that enumerators face what are at times significant difficulties during fieldwork. Participants in our survey detail a range of challenges from local actors, ranging from having their intentions questioned by local authorities to more active forms of hostility. Interestingly, enumerators report that it is survey respondents themselves who are most likely to pose a challenge, suggesting that we should take seriously research participants' agency in shaping the research experience.
Most concerningly, our results suggest that enumerators working in insecure settings often experience, or are otherwise fearful of, violence. We should not underestimate the rates at which enumerators encounter violence on the job: among those we surveyed, 58% reported having experienced or witnessed some form of violence or threat during fieldwork. In contrast to the prevailing focus on how research encounters are shaped by respondents' wartime experiences, we find that enumerators believe their own experiences with personal insecurity during data collection are more consequential for their work lives than the attributes of the respondents they interview.
Our second core finding is that enumerators working in violent contexts break research protocols in ways researchers may not fully anticipate. We present evidence that enumerators who experience personal insecurity while working (meaning that they have felt generally unsafe or have witnessed or experienced violence during survey work) and/or who work in more violence-affected communities are more likely to report that they or their colleagues break survey protocols in order to facilitate data collection. Enumerators exposed to violence also report difficulties with academic survey questions, indicating that questions can be culturally insensitive and psychologically challenging for enumerators themselves.
We lay out two clear ramifications of enumerators' work experiences for researchers working in insecure settings. The first is ethical. Survey enumerators are not immune from insecurity in the environments in which they work.
Although this reality is increasingly being noted (Baron & Young, 2021; Cronin-Furman & Lake, 2018), this work does not focus explicitly on enumerators' experiences, leaving us uninformed about the scope of risks that enumerators face during data collection. By documenting the reported experiences of the enumerators we sampled in Côte d'Ivoire, we offer an initial step towards systematically inventorying the challenges enumerators face in the field, in what we hope will be a larger conversation. Accordingly, we seek to expand the scope of ethical concerns that researchers need to weigh when conducting work in insecure settings: our focus on protecting research participants may lead us to overlook ways in which respondents themselves can pose harms to enumerators. Our ethical obligations must be to both populations.
The second key takeaway from our survey relates to data quality. Across a range of questions, enumerators report that they or other enumerators routinely deviate from research protocols. The challenges enumerators reported with the projects they work on (from the nature of survey questions to the sensitive timing of implementation), as well as their reported willingness to avoid difficult respondents and adapt survey questions, are often heightened when they have experienced personal insecurity or worked frequently in violence-affected communities. These choices can be consequential. Enumerators' own apprehension about threatening encounters during fieldwork may screen out certain types of respondents if enumerators systematically choose more amenable respondents or households. While we cannot test for these biases in our own data, we lay out below a series of pressing questions our findings raise for survey research in insecure settings.
Cumulatively, this paper carries a clear message for researchers across the social sciences: the challenges of running surveys in insecure environments cannot be resolved by clever research designs or question wording alone. Survey enumerators are a linchpin in the data collection process, and despite technological advances in our ability to monitor enumerators in the field, they continue to make a range of unobserved decisions that shape the data we receive. Throughout data collection, from how they present themselves to the way they explain or rephrase complicated questions, survey enumerators engage in brokerage that remains largely unacknowledged and unaccounted for by researchers in political science. If, as we document, enumerators who have personally experienced insecurity on the job systematically resolve these challenges differently, we risk missing or misunderstanding potential biases in our data if we fail to recognize, or at a minimum acknowledge, these realities.
Below, we summarize the literature on enumerator effects and research brokers in insecure settings. This scholarship recognizes the critical role played by local research assistants, but its focus remains on the responses of research subjects on the one hand and the ethical challenges faced by local researchers on the other. We then highlight unaddressed concerns about the security challenges enumerators face in the field and lay out the implications that these obstacles hold for the data collection process. Drawing on descriptive analysis and a set of qualitative, open-ended questions, we document the challenges faced by enumerators and the associated consequences for data collection. We conclude with a discussion of the main implications that our results suggest for researchers working in insecure settings.

Studying Survey Enumerators in the Field
Political scientists have hardly neglected the role of survey enumerators, but their focus has largely concentrated on enumerator-respondent relationships. The most significant body of work focuses on enumerator characteristics, or how enumerator attributes shape the responses they elicit. An enumerator's gender (Harris & van der Windt, 2023), religion (Benstead, 2014; Blaydes & Gillum, 2013), ethnicity (Adida et al., 2016; Dionne, 2014), and experience (Di Maio & Fiala, 2020) have all been found to influence respondent answers by priming social desirability bias or deference to those of a higher social status. These cues may be subtle: a study on polling failure in Nicaragua found that respondents inferred the perceived partisanship of a poll from the color of the interviewer's pen, biasing their responses (Bischoping & Schuman, 1992). Such effects may be amplified with sensitive questions (Blair et al., 2020).
A second stream of research has examined the role of enumerator actions. Enumerators have been argued to elicit responses, accelerate surveys, or adapt their approach to certain questions as they develop priors over the course of interviews (Olson & Peytchev, 2007). Perhaps the biggest concern has been the risk of fabricated data, as enumerators are often incentivized to maximize the number of surveys completed (see Lupu & Michelitch, 2018).
Concerns about enumerator characteristics and actions are compounded when we move to insecure environments. Work in anthropology and adjacent fields emphasizes how local research brokers are critical in facilitating research access in conflict settings, yet have inherently unequal relationships with the researchers who employ them. Local research brokers provide advice, data, and security (Boas, 2020; Cirhuza, 2020; Utas, 2019), while facing numerous challenges during data collection, ranging from their material comfort to their emotional and physical safety (see Baaz & Utas, 2019; Paluck, 2009). Concurrently, their unique positionality poses specific obstacles, such as working in contexts where one is viewed as a spy (Kadetwa, 2019), and local brokers face threats to their own safety, including harassment by local authorities, being followed, and having data stolen (Mwambari, 2019).
We build on this literature to focus on enumerator safety. While conflict researchers have reflected on the risks they themselves face in the field (e.g., Loyle & Simoni, 2017; Steinart et al., 2021), the obvious yet frequently underappreciated reality is that survey enumerators are asked to work in the same violent settings. Attention to local research staff safety has only recently received recognition in political science (e.g., Sangaré & Bleck, 2020), with the most concerted attention focused on research ethics, as scholars recognize that locally recruited research teams have often experienced the same violence as subject populations and may face duress during interviews (Baron & Young, 2021).
The question of local research staff safety is particularly concerning because, as noted by Baaz & Utas (2019), Global North research institutions are increasingly worried about the safety of their own staff and researchers, increasing our reliance on local research partners. This makes it all the more costly that we lack a solid grasp of the frequency and range of risks faced by survey enumerators. In contrast to the more intimate relationships between a researcher and an individual research assistant or fixer documented in work on research brokers (Caretta, 2015; de Guevara & Bøås, 2020; Syahar & Soedirgo, 2019), survey research poses a specific set of challenges by creating more distance between researchers and the data collection process. Survey research, particularly when managed by a firm, involves a larger array of actors, decentralized across space in different enumeration areas. As a result, questions of enumerator safety have proven harder to reflect on or, perhaps more cynically, easier to ignore.
Local research teams' experiences during fieldwork additionally hold a host of consequences for collected data. Survey enumerators engage in extensive acts of brokerage: they must navigate between standardized research protocols and local cultural expectations, engage in elaborate acts of translation to render survey objectives legible to local populations, and interface with local authorities to obtain informal research clearance. Enumerators make on-the-ground decisions about which households to survey. They ask questions and record answers, at times engaging in substantial interpretation that may generate measurement error (West & Blom, 2017). They make choices about how to present themselves, stressing various organizational or political affiliations to convince respondents to participate (e.g., Paluck, 2009) as they navigate between survey protocols and cultural expectations (see Himelein, 2016).
All of these issues are prone to additional stress when working in an insecure setting. Castorena et al. (2021) suggest that the extremely high number of fabricated surveys they discovered in the 2016 Venezuelan LAPOP survey was driven by the high degree of insecurity and unrest that enumerators faced. Tense political environments may amplify the degree to which brokers' own positionality and political views shape how they approach respondents and study instruments (Baaz & Utas, 2019; Jenkins, 2018). Accounting for these dynamics requires us to consider the political volatility and everyday violence faced by both our respondents and our enumerators (Wilson, 2018).
We see three core, unanswered questions emerging from this work that we engage below. What is the scope of challenges faced by enumerators when conducting survey research in insecure settings? What strategies do enumerators use to overcome these challenges? How might these strategies compromise the data they collect?

Research Design
To explore these questions, we surveyed enumerators in Côte d'Ivoire.¹ Côte d'Ivoire experienced a civil war from 2002 to 2007 and saw renewed fighting in 2010-2011. Although the country has been considered post-conflict since then, land-related violence and flare-ups of electoral violence in 2015 and 2020 mean that the domestic political environment has often remained quite tense. Côte d'Ivoire's history of political violence is similar to other contexts in Africa: over 70% of civil wars have involved rebels taking over and controlling territory (Huang, 2016), while between one-third and half of elections since the 1990s have involved associated violence (Burchard, 2015; Taylor et al., 2017). Indeed, Afrobarometer data from 34 countries (2019) suggest that Côte d'Ivoire scores around or above average on measures of insecurity, which include the share of citizens who report feeling unsafe, who have been attacked or experienced crime, and who rate insecurity as an important issue for their government to address (see Figure A1 in the Online Appendix). In short, Côte d'Ivoire is not an outlier on many dimensions of insecurity and conflict. We may even be underestimating the impact of insecure research settings: higher shares of respondents report feeling unsafe, and report that insecurity is the most important political issue, in countries like Madagascar, South Africa, or Mali.
Research in political science that utilizes face-to-face surveys in Côte d'Ivoire covers diverse topics but focuses primarily on assessing citizen experiences with violence. For example, McCauley (2014) used surveys and implicit association tests to examine religious and ethnic bias, while Martin et al. (2021) surveyed communities that were under rebel control during the war to understand citizen behavior. Smidt (2020) conducted surveys to understand citizen perceptions of violence and appreciation for UN peacekeeper education projects. The Afrobarometer has been fielded four times in the country (Rounds 5-8: 2013, 2015, 2017, and 2019) and, with the exception of Round 7, has asked specific questions about violence. Rounds 5 and 6 asked about preferences over reconciliation, and Rounds 6 and 8 were fielded in advance of major elections, asking whether respondents believed elections would be peaceful and what measures would be effective to ensure their peacefulness. Still, Côte d'Ivoire is not among the most frequently surveyed countries in Africa. A Google Scholar search reveals that Côte d'Ivoire ranks below Mali, Niger, and Cameroon, countries of comparable population, in political science publications (see Table A5 in the Online Appendix).
In short, even though Côte d'Ivoire is not the most saturated research setting, it has seen a fair amount of survey research, and given its history with violent conflict, these surveys have often asked respondents about violence. As a result, Côte d'Ivoire is a compelling case for exploring enumerator experiences both because it has a relatively well-developed survey market and because its history of conflict and violence creates a setting in which enumerators are likely to have faced the types of challenges documented in the previous section, including violence exposure, sensitive survey topics, and insecurity during the data collection process.
Our ideal sampling frame would be all enumerators who have conducted academic surveys in Côte d'Ivoire, including those who may have quit or who do not work for a formal firm. Unfortunately, this ideal frame would be nearly impossible to construct. Instead, we contacted three prominent survey firms in Abidjan, the economic capital, to see if they would be willing to advertise our survey to their enumerator pool. The three firms, the Center for Research and Training on Integrated Development (CREFDI), Innovations for Poverty Action (IPA), and IPSOS, facilitated access to enumerators who work for them.² These firms were selected due to their prominence in political research: CREFDI is the Afrobarometer operating partner, IPA is an internationally renowned organization that conducts projects for social scientists, and IPSOS is the largest public opinion firm operating in Côte d'Ivoire. We recruited participants by sending either a WhatsApp message or an email with a unique code for accessing a survey on Qualtrics. As an incentive to participate, all respondents were entered into a drawing to win one of four Samsung tablets (valued at $150). In total, we contacted 371 enumerators, 248 of whom completed our survey, yielding a response rate of 67%. We confirm in the Online Appendix that we do not have differential nonresponse by gender or ethnicity.³
We recognize that this sampling method may introduce bias, and we thus caution against interpreting our results as fully generalizable. However, we have reason to think that these results may be representative of enumerator experiences in Côte d'Ivoire. First, we advertised the survey as one in which enumerators would simply share their experiences and results would be anonymous; we have no reason to believe that enumerators would opt out for fear of retribution by their employer. Second, several respondents indicated that they had worked for multiple survey firms as well as on independent research projects, suggesting that we may be capturing a large share of the enumerator pool. Finally, it is possible that enumerators with particularly negative experiences with data collection may have quit the firms altogether, and thus would not appear in our sampling frame. However, even if this were the case, it would mean we are underestimating this population's exposure to violence, as we show that close to 60% of our sample report experiencing or fearing violence.
Participants were asked to complete a survey that took approximately twenty to thirty minutes. Specific question wording is introduced below, but questions were clustered in five blocks: demographic information, experiences working as an enumerator, challenges faced while implementing projects, challenges encountered as a function of survey content, and solutions employed when addressing these challenges. The survey also included a handful of open-ended questions. Summary statistics on respondent demographics are presented in Table 1. Descriptive statistics for all variables in this paper can be found in the Online Appendix.
Of note, our data are collected at the enumerator level, not the survey or project level. Ideally, we would have collected data at the survey/project level, an approach we encourage future researchers to adopt in our Implications section below. We have no way of knowing, for example, whether every project an enumerator undertook involved insecurity or data collection difficulties. We intentionally asked enumerators to think of the sum of their experiences to better understand the scope and magnitude of the challenges they have faced while conducting research.
The average enumerator in our sample is highly educated, with over 90 percent having completed at least some university. Most enumerators were male, which is reflective of the pool (see sampling frame demographics in the Online Appendix).⁴ The largest ethnic group in Côte d'Ivoire, the Akan, is well represented within the enumerator sample, while northern groups (North Mandé and Voltaïque) are in the minority (they make up 11% and 10%, respectively, of the general population). The sample is overwhelmingly Christian, and the vast majority do not explicitly support the ruling party. Although most respondents hail from Abidjan, 60% of those respondents had worked in at least one department⁵ outside of Abidjan (maps of where enumerators worked and live can be found in Figure A3 in the Online Appendix). Over half of the sample had worked on at least one project about politics, peace, or conflict (additional descriptive statistics of topics covered can be found in Table A9 in the Online Appendix).

Enumerator Challenges in the Field
We begin with a descriptive analysis of the high rates of personal insecurity during data collection and the significant challenges faced by enumerators, before presenting protocol breaches as a solution enumerators use to contend with these challenges.

Enumerators Experience Significant Personal Insecurity During Fieldwork
Enumerators report strikingly high rates of personal insecurity during survey work. Almost 60% of enumerators felt unsafe at least sometimes while conducting academic surveys, while over 75% reported fearing for their physical safety in their assigned regions. Of the 40% of enumerators who considered leaving their work due to challenges they faced, 20% said they considered leaving because they feared for their physical safety. Indeed, this was the fourth most common response, after insufficient pay (77%), too long a time commitment (26%), and personal issues (25%). As a comparison, only eight percent of enumerators considered leaving because of sensitive survey topics. Forty-two percent of enumerators who said they would leave because they feared for their safety reported actually having left a job due to those challenges.
Personal Experience with Violence. We use four survey questions to capture enumerators' experiences with violence during fieldwork, covering both their own personal experiences and fears of violence as well as whether they worked in violence-affected communities.
We asked:
1. Physical safety: How often have you feared for your physical safety in your assigned region? Possible responses were on a Likert scale from 0 to 4: never; sometimes; about half the time; most of the time; always.
2. Felt unsafe: How often have you felt unsafe while conducting surveys? Responses were on the same scale.
3. Experienced violence: Have you ever faced the following situations when collecting data? If so, please indicate how frequently. Respondents answered whether they had been followed, robbed, threatened with violence, physically assaulted, or detained. Responses were on a Likert scale of 0-4: never; once; a few times; multiple times; every time I have done this work. We create a dichotomous variable, "violence experience any," which takes a one if the enumerator scored higher than a one on any of the types of violence experience.
4. Witnessed violence: We also asked whether enumerators had witnessed violence during the course of fieldwork: Have you ever witnessed the following while collecting data? Respondents answered whether they had seen (yes or no) mass protests, threats and harassment, physical violence, or theft/destruction of property. We create a corresponding measure, "violence witness any," indicating whether the enumerator witnessed any of these types of violence.
The distribution of responses to the first two questions is shown in Figure 1. Three-quarters of the enumerators we surveyed report fearing for their physical safety at least sometimes, with over a quarter saying they felt this way at least half the time. Similarly, over 50% report feeling unsafe at least sometimes during data collection.
Figure 2 displays the distribution of responses to our questions about enumerators' specific exposure to violence on the job. Almost a third of respondents indicated that they had been followed at least once while collecting data. As detailed by an enumerator with 14 years of experience, describing a survey during the war: "There were rebels everywhere and I had the impression that I was being followed.… the project I was piloting was on politics and … a respondent told us that since we [the survey team] had arrived we were identified and were being followed…" Twenty-six percent of respondents had been threatened with violence at least once. A smaller, though still concerning, number of respondents (under 10%) report having been robbed, physically assaulted, or detained. These experiences happened at both the team and individual level. Describing conducting a survey at an informal gold mining site, an enumerator from Abidjan recounted being detained. The son of the head of the gold miners had first allowed the enumerators to work, only to then refuse to "give them the road… it was a disaster", a local expression indicating that they were not allowed to depart until late in the evening, after the father's intervention.
A fifth of respondents witnessed mass protests while on the job. Protests and riots in Côte d'Ivoire routinely involve violence: 25% of protests from 2011 to 2020⁶ resulted in at least one fatality (a conservative measure of the level of violence in protests, since protests can also result in injuries and arrests), and 45% of protests involved government-led violence (ACLED, 2021). Almost a fifth of respondents witnessed threats and harassment, while 11% witnessed physical violence. Taken together, these figures show clearly that enumerators are frequently exposed to violence while on the job. In subsequent analyses, we employ an additive "violence experience/fear" index that counts the number of positive responses to each of the four questions enumerated above.⁷ The distribution of this variable can be found in Figure 3.
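To make the index construction concrete, the additive count can be sketched as follows. This is our own minimal illustration, not the authors' replication code: the variable names (feared_safety, felt_unsafe, etc.), the toy records, and the "positive if greater than zero" threshold for the two Likert items are all assumptions.

```python
# Minimal sketch of an additive "violence experience/fear" index that counts
# positive responses across the four insecurity measures described above.
# Variable names, thresholds, and toy records are illustrative assumptions.

def violence_index(enum):
    """Count how many of the four insecurity measures are positive.

    feared_safety / felt_unsafe: 0-4 Likert items (0 = never); treated as
    positive when greater than zero. experienced_any / witnessed_any:
    dichotomous "any" indicators (0 or 1).
    """
    return (
        int(enum["feared_safety"] > 0)
        + int(enum["felt_unsafe"] > 0)
        + int(enum["experienced_any"])
        + int(enum["witnessed_any"])
    )

# Two invented enumerator records for illustration.
enumerators = [
    {"feared_safety": 2, "felt_unsafe": 1, "experienced_any": 1, "witnessed_any": 0},
    {"feared_safety": 0, "felt_unsafe": 0, "experienced_any": 0, "witnessed_any": 0},
]
scores = [violence_index(e) for e in enumerators]  # each index lies between 0 and 4
```

By construction the index ranges from 0 (no insecurity reported on any measure) to 4 (positive on all four measures).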
Communities Affected by Violence. In addition to personal experience with violence, we also examine whether enumerators worked in violence-affected communities. To measure this, we calculate the number of violent incidents (ACLED, 1997-2021) in the departments where our enumerators have worked. We then classify a department as a "high violence" department if its number of incidents is higher than the national average (excluding Abidjan). Finally, we calculate the share of violent departments that the enumerator has worked in outside of Abidjan.⁸ Although this measure is not as precise as the ideal (we do not collect data on enumerator perceptions of the villages or respondent characteristics where they worked, nor can we differentiate between an enumerator who has worked in one location several times and someone who worked only once in a department), we assert that it serves as a good proxy for the socio-political context in which enumerators have worked. We also note that these two variables, the violence index and the share of violence-affected communities worked in, do not actually correlate; in other words, enumerators experience fear and violence across the spectrum of their experience working in violence-affected communities. We therefore consider these measures independently in subsequent analyses, where we examine whether enumerators who experienced or feared violence, or who worked in violence-affected communities, more often face challenges or data quality issues.
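The three-step community measure can likewise be sketched in code. The department names and incident counts below are invented for illustration; only the logic (average excluding Abidjan, above-average classification, per-enumerator share) follows the description above.

```python
# Illustrative sketch of the community-violence measure: classify departments
# as "high violence" when their incident count exceeds the national average
# (excluding Abidjan), then compute each enumerator's share of high-violence
# departments among those worked outside Abidjan. Counts here are invented.

incidents = {"Abidjan": 500, "Bouake": 80, "Korhogo": 30, "Man": 120, "Daloa": 10}

non_abidjan = {d: n for d, n in incidents.items() if d != "Abidjan"}
avg = sum(non_abidjan.values()) / len(non_abidjan)          # national average, ex-Abidjan
high_violence = {d for d, n in non_abidjan.items() if n > avg}

def violent_share(departments_worked):
    """Share of an enumerator's non-Abidjan departments that are high violence."""
    outside = [d for d in departments_worked if d != "Abidjan"]
    if not outside:
        return 0.0
    return sum(d in high_violence for d in outside) / len(outside)

share = violent_share(["Abidjan", "Bouake", "Daloa"])  # one of two outside departments
```

As noted in the text, this proxy ignores how often an enumerator worked in each department; a refinement would weight departments by the number of deployments, which our sketch does not attempt.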

Enumerators Face Numerous Challenges in the Field
The second and related finding is that enumerators face a number of challenges when conducting academic surveys. We asked enumerators to document challenges to their working conditions, from issues around pay to feeling unsafe. The distribution of challenges reported at least sometimes is shown in Figure 4's left-hand panel. The most common challenges experienced were related to working conditions: insufficient pay, insufficient food/accommodations, and difficult travel conditions. Seventy-six percent of respondents feared for their physical safety, while almost 40% felt otherwise uncomfortable in their assigned region. This makes fear and discomfort tied to where enumerators worked the fourth most common challenge faced.
Overwhelmingly, enumerators who felt unsafe attributed this sentiment to the conditions they were asked to work in (see the right-hand panel of Figure 4). A majority of respondents cited the political environment (57%). As one enumerator recounted: "I found myself [in the southeast] during conflicts between populations after the legislative elections and I can say that as a supervisor in this field, I had the biggest fear for my team and me." Almost 40% of respondents also indicated high crime as a reason for fear while collecting data. Twenty-three percent of respondents reported harassment from either respondents or local authorities, while 28% felt otherwise unwelcome when conducting surveys.
While enumerators complained about insufficient pay and time commitments, we do not find a relationship between these critiques and exposure to violence. We do, however, show in Figure A4 in the Online Appendix that some difficult work conditions, such as difficult travel conditions and limited cell reception, covary with violence experience. We return to a discussion of this point under 'Implications.'
Violence Experience and Challenges in the Field. Having demonstrated the scope of challenging conditions faced by enumerators in insecure settings, we next ask whether personal insecurity is associated with reported challenges. We examine two distinct categories of challenges that enumerators might face during the course of fieldwork:
1. Local actor challenges: For a suite of local actors, including village chiefs, local political party leaders, mayors, prefects (regional administrators), police or military officials, youth leaders, respondents' families, respondents themselves, and other members of the local population, we asked whether such an actor had ever created problems. Specifically, we asked respondents to answer, for each actor, on a Likert scale (0-6): I have never interacted with this person/people; this person/people have never created problems for me; this person/people have questioned my intentions; this person/people have threatened me; this person/people have physically intimidated me; this person/people have physically attacked me. We dichotomize these variables to indicate whether the actor posed a problem for the enumerator.
2. Data collection difficulties: We asked: As an enumerator on these academic projects, your job is to collect the data for the researchers to use. With regards to the data collection process, how much do you agree with the following statements about the survey questions you ask: They are usually too complicated for the respondents; they are not usually relevant to the respondents' everyday experiences; they need to be rephrased to help the respondent understand the researcher's intent; they are too sensitive or upsetting for respondents; they are out of touch with the cultural environment I work in; they are psychologically taxing for me as an enumerator. Possible responses were on a Likert scale from 0-6: strongly disagree; disagree; somewhat disagree; neither agree nor disagree; somewhat agree; agree; strongly agree.
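The dichotomization of the local-actor items in the first category can be sketched as follows. The numeric codes and actor names are our own assumptions, inferred from the ordering of response options in the text, not the authors' actual coding scheme:

```python
# Sketch of dichotomizing the local-actor challenge items. We assume codes
# follow the ordering in the text: 0 = never interacted, 1 = never created
# problems, 2 = questioned my intentions, 3 = threatened me, 4 = physically
# intimidated me, 5 = physically attacked me. Any code of 2 or above then
# counts as the actor having posed a problem.

def posed_problem(code: int) -> int:
    """1 if the actor created any problem for the enumerator, else 0."""
    return int(code >= 2)

# Hypothetical responses for one enumerator across three actors.
responses = {"village_chief": 1, "respondent": 3, "police": 0}
problems = {actor: posed_problem(code) for actor, code in responses.items()}
# problems -> {"village_chief": 0, "respondent": 1, "police": 0}
```

One design question such a scheme raises, which our sketch sidesteps, is whether "never interacted" should be treated as a zero or as missing; the text does not say which choice the authors made.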
Here we explore whether enumerators with experience of violence are more likely to report other challenges during fieldwork. To be clear, in exploring these associations, we do not posit a causal interpretation of our findings. Our survey design does not allow us to test these relationships, as we did not ask enumerators to connect their reported exposure to violence with a specific project, period, or location, or to recount specific challenges associated with that exposure. This analysis is therefore suggestive; however, we believe that by highlighting these relationships, we draw attention to how enumerator experiences may shape the work they complete on behalf of academic researchers, an overlooked aspect of survey data collection.
We first examine the relationship between personal insecurity and other fieldwork challenges. The heat maps depicted in Figure 5 show the number of enumerators who affirmatively responded to the local challenges questions, by violence experience and by share of violence-affected communities worked in. Notably, respondents and their families pose acute challenges: nearly 40% of enumerators we interviewed reported having faced some sort of challenge from respondents, and a further 36% reported challenges from respondents' families. At the extreme, this involves acts of physical aggression. An enumerator from southwest Côte d'Ivoire who was working on a political project just after the war ended said that, after asking a few sensitive questions, the respondent "who was not in favor of the new regime, locked me in his house, telling me to call the one who sent me, the new president, to resuscitate his family members who died in this crisis." The heat maps in Figure 5 convey that enumerators who have more experiences with violence, as measured by the index, as well as enumerators who have worked in a higher share of violence-affected communities, are more likely to report challenges from local community members. To illustrate the magnitude of these experiences: of the enumerators who reported challenges from respondents, 64% had at least one violent experience, 44% had witnessed a violent experience, 85% had feared for their physical safety, and 71% reported feeling unsafe during fieldwork.
Although our data do not allow us to discern temporal or causal sequences in these relationships, they do suggest clearly that these experiences, fear of and exposure to violence on the one hand and local actor challenges on the other, move closely together. Our qualitative data reveal how common forms of challenges, like verbal threats and harassment, increase enumerators' sense of discomfort and fear. A female enumerator described an interview that was interrupted by the respondent's friend, who threatened her and asserted that she was from the judiciary police: "I told him no and I even showed him my badge, but it was useless… thank God the respondent himself confirmed the work I was doing and said 'leave this lady alone'. And I was saved because it was a place where no one could save me except God." Another enumerator, originating from and working in the far west, described being questioned by a group of young people: "they began to ask me questions to know for which political leader I was working, because there were people who came in the past, misrepresented themselves and engaged in actions that raised tensions in the community… some were agitated and made noises, but with the divine grace of God they ended up understanding me." Enumerators who experienced violence or feared for their physical safety also hold distinct opinions about questions used in academic surveys, as shown in Figure 6. A larger concentration of enumerators who scored higher on the violence experience and fear index agree that questions are too sensitive. Enumerators who worked in a higher proportion of violence-affected communities indicated that questions needed rephrasing to avoid issues with respondents. We also show that more enumerators who experienced violence or fear, and who worked in more violence-affected communities, agreed that the survey questions they were tasked with asking were psychologically challenging for the enumerator.

Enumerator Responses and Consequences for Data Validity
We next examine how enumerators try to overcome the challenges introduced above by adapting or breaking survey protocols. We first presented a list of possible adaptations to or breaches of survey protocols and asked respondents to indicate, on a 0-4 scale from never to always, whether they believed that other enumerators would: skip difficult respondents or households; fabricate answers to survey questions; make the questions easier for respondents; adapt or rephrase questions to be less sensitive for respondents; abandon the questionnaire in the middle of the interview; deviate from the random-walk protocol; lie to a supervisor about the reason an interview needed to be redone; and select respondents from a household who are easier to survey. The intention behind asking about other enumerators was primarily to reduce potential social desirability bias, but we also believe that enumerators would have information about their peers. Enumerators often work in teams and have ample opportunity to share their experiences with each other during travel or downtime. Additionally, almost 50% of our sample has served as a supervisor at one point or another, increasing the likelihood that they would be privy to breaches that were reported or shared.
Second, we asked respondents how likely or unlikely (0-6) they themselves would be to adopt any of the following strategies if they felt unsafe or threatened: skip households; fill in answers; skip questions; choose respondents who are less challenging; abbreviate questions; or abbreviate the consent process.
We identified these specific protocol adaptations and breaches in two ways. First, we drew on the Afrobarometer protocol, the most comprehensive public opinion survey on the continent and a commonly used template. The Afrobarometer protocol instructs enumerators to first select a household at random using a random-walk protocol. Enumerators then use a 5/10 interval pattern to select subsequent households. According to the Afrobarometer manual, enumeration areas can be substituted in situations of "insecurity," but there is no formalized protocol for substituting households in similar situations. Instead, enumerators can indicate in the survey instrument the reasons for no calls. If the selected respondent refuses to participate, the enumerator should replace the household by again selecting the 10th household following the random-walk protocol. The manual is very clear that "we substitute households, not respondents." Second, we identified potential adaptations or breaches during the course of a survey by reading widely in public opinion research, with particular focus on work that examines how enumerators can influence question standardization and delivery (e.g., Blom & Korbmacher, 2013), as well as by drawing on our own experiences conducting surveys in the region.
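The interval logic behind such household selection can be illustrated with a stylized sketch. This is not the Afrobarometer's authoritative procedure (which also specifies start points, direction of travel, and substitution rules); it only shows what a fixed-interval walk implies for which households are approached:

```python
# Stylized sketch of fixed-interval household selection along a
# random-walk route. Illustrative only; consult the Afrobarometer
# manual for the authoritative sampling rules.

def select_households(route_length: int, interval: int) -> list[int]:
    """Positions of households selected along a route of the given length."""
    return list(range(interval, route_length + 1, interval))

# With an interval of 10 along a 50-household route:
select_households(50, 10)  # -> [10, 20, 30, 40, 50]
```

The sketch makes one consequence of the protocol visible: skipping a selected position (e.g., an unsafe household) without re-applying the interval rule changes which households can ever enter the sample.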
These protocol adaptations and breaches hold potentially serious consequences for the quality of the data researchers collect. Take, for example, the question of emphasizing certain affiliations to gain community entry. At times, this might be benign; one enumerator recounted that she found it helpful to tell respondents that she "just wanted to know [their] party in order to take the same path as [them]" when having to ask about respondent partisanship, a strategy she found put them at ease and helped elicit sincere answers. But if enumerators cue partisan identities to gain access in this way, they could introduce bias in our estimates of party support, an important question for political scientists. This risk receives support in Marfouk et al. (2021), who find that Afrobarometer respondents who incorrectly perceive the government as the survey's sponsor are statistically more likely to report higher rates of trust in the incumbent party. This is a pressing area for future research.
Table 2 summarizes the potential risks to data validity from these protocol breaches, illustrating what each might look like in practice with concrete examples. The table is structured according to the flow of a survey. We encourage researchers to use this table as a guide to think through potential threats to data validity in their own research. We are agnostic as to which risks are more concerning because we expect this to be highly project specific. Still, issues pertaining to question response procedures will likely be critical in survey experiments, where consistency in question set-up and wording is particularly important. Some of these sources of bias might be easily addressed by looking at balance tests across enumerators, but others are by their very nature more opaque and hard to assess from afar. Regardless of the scope of bias, it is important for survey researchers to recognize the real potential for these types of violations and to think through the potential consequences for their data.
The percentage of respondents indicating that they believe their peers would break protocols is displayed in the top panel of Figure 7. Respondents are quite likely to think their peers break three protocols in particular: 82% of respondents agreed that their peers would at least sometimes abandon a survey underway if they felt unsafe, 74% that peers would adapt or rephrase questions, and nearly 70% thought they would skip households. A further 37% thought it was likely their peers would adapt questions for respondents who struggled with answering, such as shortening long texts or explaining what a question 'really means.' Enumerators' reported likelihood that they themselves would adopt strategies that break protocol is shown in the bottom panel of Figure 7. Contrary to our expectation that enumerators would be more hesitant to reveal that they would break protocol, we find this not to be the case. Seventy-one percent of respondents admitted that they would skip households, which is not necessarily a breach of protocol; under the Afrobarometer manual, for example, households can be replaced. However, frequently skipping households due to safety concerns, or circumventing whole neighborhoods, may introduce bias that we may not account for in research design and analyses. Almost a third of respondents agreed that they would select easier respondents, effectively replacing respondents and not households. Although we do not know how motivated by social desirability our enumerators were when they responded to these questions, the extremely high percentages of enumerators indicating that they themselves or their peers are likely to select easier households or respondents is notable.
Are enumerators who are exposed to violence, insecurity, and local challenges during fieldwork more or less likely to report compromises to data quality? We preface this discussion by stressing that enumerators expressed deep commitment to their profession and were proud of their work. Ninety-eight percent of respondents felt that they were an important element in the research process. Every enumerator thought it was important to participate in research that improves conditions in their country (see Figure A5 in the Online Appendix). The discussion below should therefore not obscure the professionalism expressed by our respondents, but rather reflects the challenges of working in difficult environments.

Figure 7. To deal with challenges, how often do…
Figure 8 depicts heat maps of the number of enumerators who reported that others routinely break protocols, while Figure 9 shows the number of enumerators who reported that they themselves break protocols. In contrast to the challenges results described above, the relationship between violence experience and protocol breaches appears less clear. Two things are notable about Figure 8. First, in general, greater exposure to violence or to affected communities correlates with an increased likelihood of deviation from survey protocols: we see a higher concentration of enumerators reporting protocol breaches among those who have worked in a higher share of violence-affected communities. Second, enumerators are more likely to deviate from some protocols than others. Most notably, enumerators broadly think that others (and they themselves) skip households when feeling unsafe, and that others will adapt questions or abandon the questionnaire. Of course, as noted, skipping households is not a breach of protocol, and we would never suggest that enumerators interview households that make them uncomfortable. What we do stress, however, is that when we as researchers are unaware of high rates of skipped households, we may misunderstand potential biases in our data. For example, one common story that emerged in the qualitative section of our survey was that husbands often did not want their wives to be surveyed. A female enumerator, despite her gender, recounted being chased away by a husband for wanting to interview his wife. Together with her supervisor, she decided to skip the household altogether. This has implications for data collection: if enumerators skip "difficult" households in this way, the opinions of women with such partners, who may be a distinct population, will be systematically screened out.

Figure 8. Experience with violence and others break protocol.
Another ostensibly benign way that enumerators may break protocol is by emphasizing an identity characteristic to put respondents at ease. Most surveys in the social sciences, including the Afrobarometer, stress the need for enumerators to read a set introductory script that transparently states the scientific intentions and neutrality of the research team. Emphasizing an affiliation like partisanship reduces the likelihood that an enumerator will be seen as neutral by the respondent, an important component of the data collection process. A final question about adherence to protocols asks respondents "Have you ever emphasized the following affiliations or identity to make an interviewee comfortable?" on a Likert scale (0-4) from never to always, about the following affiliations: the enumerator's ethnicity, political party, or religion.
We find that 64% of enumerators have emphasized an affiliation to put respondents at ease. The largest proportion of enumerators emphasized their ethnicity (see Figure 10), while almost a third emphasized their religion. Examples of how enumerators deploy these affiliations to avoid problems with respondents emerge from the qualitative data. A non-Muslim enumerator attempted to appeal to a respondent who was challenging him: "during our interview, the [head of household] was furious, he threatened to kill me because according to him I had come to look for his wife in his own house. I tried to calm him down… he asked if I was Muslim and I said no, that I was a Christian, which made the situation worse… in a rush, I told him a proverb in Malinké ([the head of household's] language) which said 'we are all children of Allah'… after a few questions and answers he agreed to continue the interview on the condition that he is next to us." The heat maps depicted in Figure 11 show that enumerators who report having emphasized their ethnicity or religion have worked in more violence-affected communities or had themselves feared or experienced violence. Interestingly, we show in Table A10 in the Online Appendix that enumerators do not think their co-ethnics are easier to survey than the out-group; 53% of enumerators said that it is just as easy to survey in-group as out-group members. This suggests that these effects are specific to violent contexts: enumerators are not cuing ethnicity with co-ethnics in line with expectations from previous work (Adida et al., 2016), but rather appear to be using ethnicity as a means of entry into violence-affected communities or when they themselves have been exposed to violence. While we recognize that enumerators are often trained to create a connection with respondents, cuing identities that prime respondents to think about contentious political dynamics could have implications for their responses. Across a range of variables, we show that enumerators
who faced personal insecurity during data collection also report more challenges from local populations, express difficulties with some types of survey questions, and sometimes deviate from research protocols. We next address the portability of our findings and suggest implications, as well as possible remedies.

Generalizability of Findings
Our results risk being specific to Côte d'Ivoire. To address this concern, we make use of an underutilized set of questions asked of enumerators at the end of each Afrobarometer interview in Round 7 (34 countries). Specifically, the survey asks enumerators a series of questions about whether they were or felt threatened during the interview and about the attitude of the respondent. Enumerators across Africa reported that they were or felt threatened in two percent of interviews, while less than one percent of interviews involved a hostile respondent and three percent involved a suspicious respondent. Côte d'Ivoire's reported threats, at nine interviews out of 1,200, are higher than the median in the Afrobarometer sample (see Figure A6 in the Online Appendix).
Though these numbers may appear low, it is important to remember that these data are at the interview level; our data capture enumerator experiences over the course of several projects. If we aggregate the Afrobarometer data to the enumerator level (N = 1,076), 16% of enumerators faced at least one hostile respondent, 16% faced a threatening respondent, and 37% faced a suspicious respondent. In many cases, enumerators were confronted with multiple difficult interviews over the course of the study: an enumerator in South Africa faced 42 threatening interviews, while an enumerator in Mozambique faced 30 hostile respondents and 51 suspicious respondents.
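The move from interview-level flags to enumerator-level shares can be sketched as follows. Field names and the toy records are hypothetical, not drawn from the Afrobarometer release:

```python
# Sketch of aggregating interview-level enumerator reports to the
# enumerator level, as in the 16%/16%/37% figures discussed above.
from collections import defaultdict

interviews = [
    # (enumerator_id, hostile, threatened, suspicious) as 0/1 flags
    ("enum_1", 0, 0, 1),
    ("enum_1", 1, 0, 0),
    ("enum_2", 0, 0, 0),
]

faced = defaultdict(lambda: [0, 0, 0])
for enum_id, hostile, threat, susp in interviews:
    flags = faced[enum_id]
    flags[0] |= hostile   # faced at least one hostile respondent
    flags[1] |= threat    # faced at least one threatening interview
    flags[2] |= susp      # faced at least one suspicious respondent

share_hostile = sum(f[0] for f in faced.values()) / len(faced)
# enum_1 faced a hostile respondent at least once; enum_2 did not,
# so share_hostile is 0.5 in this toy example.
```

The "at least one" aggregation is what makes the enumerator-level shares so much larger than the interview-level percentages.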
We next examine whether Afrobarometer enumerators working in insecure environments face challenges from respondents. We create a series of proxies for an insecure working environment by measuring enumeration-area-level insecurity: the share of respondents reporting that they feel unsafe, were physically attacked, were the victim of a crime, or named insecurity as the top problem. An insecure enumeration area is one that scores higher than the average on any of these measures. As we show in Figure A7 in the Online Appendix, in most countries a higher proportion of enumerators working in these areas report hostile, suspicious, or threatening respondents compared to enumerators working in low-insecurity enumeration areas.
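The enumeration-area proxy amounts to flagging areas that sit above the sample-wide mean on a given measure. A minimal sketch, with illustrative area names and shares:

```python
# Share of respondents in each enumeration area reporting that they
# feel unsafe (illustrative values, not replication data).
unsafe_share_by_area = {"area_a": 0.10, "area_b": 0.45, "area_c": 0.20}

# An area counts as insecure if its share exceeds the sample-wide mean.
mean_share = sum(unsafe_share_by_area.values()) / len(unsafe_share_by_area)
insecure_areas = {a for a, s in unsafe_share_by_area.items() if s > mean_share}
# mean_share is 0.25 here, so only area_b is flagged insecure.
```

In the paper's construction this check is repeated for each insecurity measure, and an area above the mean on any of them is treated as insecure.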
Taken together, these results suggest that even across different country contexts, enumerators may be working in insecure research environments with challenging respondents. To our knowledge, no academic papers leverage these enumerator-specific questions in the Afrobarometer, either to answer the questions we ask in this paper or to control for threats in analyses of other questions in the survey.

Implications
The findings presented above hold implications for two important dimensions of social science research. First, they urge us to continue efforts towards establishing new disciplinary norms around research in insecure settings. In particular, we echo and expand upon recent calls for scholars to include ethics appendices for papers involving fieldwork (Asiedu et al., 2021) or to otherwise detail precautionary steps taken to mitigate risk (a good example of which can be found in Rudloff & Vinson, 2021). As Lupu and Michelitch (2018, p. 206) note, the only formal ethical gatekeepers for field research, institutional review boards, restrict their purview to risks faced by research participants, not research staff. This effectively renders us as researchers the only monitors of field staff well-being. Unfortunately, the increasing availability of local research institutes and survey firms is a double-edged sword in this respect: it facilitates large-scale data collection from afar while simultaneously rendering it easier for us to ignore this responsibility. Efforts to establish norms of documenting and detailing strategies taken to minimize risks to enumerators should be encouraged widely. We envision an encompassing norm that is not limited to enumerators but covers local research staff writ large, whether the work be qualitative, observational, or experimental in nature.
Second, and perhaps most consequential for researchers, our findings can be read as a call to alter our research designs to minimize risks to field staff. Although our own sample is limited to enumerators working in a post-conflict setting, we believe, in line with the Afrobarometer results discussed in the previous section, that much of what we discuss below pertains broadly to contexts of high crime, insecurity, and ongoing violence. We detail six key points in the research process where we believe systematic alterations are needed, acknowledging throughout that these may raise both temporal and financial costs to research.
• While elaborating research designs, scholars should be attentive to the timing and content of their surveys. Particularly challenging for political scientists, enumerators we surveyed frequently suggested avoiding election periods if possible. Certainly, we do not think that political scientists should stop running surveys in the lead-up to elections or other politically tense moments. However, working in such contexts does pose a particular responsibility. Working with local research teams when planning survey implementation could help determine the times and locations that would best reduce risks to all participants (Davis, 2020). If surveys must be run during high-risk periods, it is imperative to maintain contact with local teams and to be prepared to avert, suspend, or delay data collection if enumerators perceive security threats. Researchers need to maintain editorial control over their research, but we also need to work with local conditions and contexts to produce internally valid research.

• During project implementation, social scientists have a responsibility to inform themselves of enumerators' working conditions, which hold specific implications for field management. Researchers should familiarize themselves with firms' hiring practices, for example. While some recruit enumerators largely from urban areas, often the capital, others recruit in a more decentralized fashion. We might imagine that enumerators in the latter case would have a better sense of the risks and challenges they might face and could be useful sounding boards prior to beginning fieldwork. Researchers could reserve time during recruitment and training for frank conversations with survey teams about potential risks to their security and mental health, as well as for developing collective strategies to deal with common challenges, such as suspicious respondents or politically tense communities (see discussion in Syahar & Soedirgo, 2019). This could include lengthier training sessions that serve both to improve the quality of the data collected and to allow researchers to build rapport with enumerators and facilitate the flow of information about working conditions. A second aspect of this question is more general. We ended our survey by asking enumerators what feedback they had for the researchers of the last study they worked on. Over half of our respondents wrote something pertaining to working conditions, ranging from insufficient or delayed pay (16.5%) to concerns about transportation and lodging (4%). We can only speculate about the relationship between working in insecure environments and concerns over these conditions; at the least, we find no indication that enumerators are being compensated for working in insecure environments. What we can say is that many enumerators expressed genuine interest in the work, appreciating among other things the professional skills they learned and the ability to travel and learn about their own countries, and were committed to research they believe could help their communities. If nothing else, the literature on research brokers makes clear that enumerators are not low-skilled workers, and we should not treat them as such.

• Piloting of surveys offers a unique opportunity to assess whether questions will pose specific threats to enumerators. Eighty-seven percent of our respondents said that surveys would be improved if researchers first consulted them. Specifically, enumerators felt that including them in the development of questionnaires would help ensure questions were culturally sensitive and adapted to local realities. These concerns were wide-ranging, with enumerators citing both political and socio-economic questions, which many noted were perceived by respondents as highly personal, raising suspicions about enumerator intent. This holds implications for the rapport between respondent and enumerator, while also raising important questions about how we understand what is sensitive in different contexts. Soliciting enumerator feedback about question wording prior to and following piloting would be one avenue to assess this. This may pose thorny trade-offs for researchers: questions designed from afar to achieve social scientific goals may need to be reevaluated if enumerators find they pose challenges. Conversely, by incorporating local feedback, survey questions may obtain greater construct validity because they are attuned to local contextual factors.

• As has been suggested by others (e.g., Baron & Young, 2021), researchers working on particularly sensitive topics may consider providing psychosocial support if they expect enumerators may experience secondary trauma. Providing tools that support enumerator mental health may also help to ensure their well-being and safety (Herman et al., 2022). A less costly option is suggested by Paluck (2009) and Rudloff and Vinson (2021), who note the importance of team debriefs both as a means to maintain morale and decompress and as a way for researchers to assess challenges and concerns. For scholars working remotely, requesting research team contact information would allow check-ins to gain awareness of any challenges encountered during implementation.

• Researchers should also routinely collect data on enumerator characteristics, prior to and after survey completion. Understanding enumerator and survey firm positionality may help researchers alleviate potential challenges encountered (Davis & Michelitch, 2022; Haas et al., 2022). In addition to collecting standard demographic data, researchers in violent contexts should aim to understand enumerators' past experiences with violence, for example by incorporating violence exposure into post-interview questionnaires. This could concretely include questions like those used in the Afrobarometer, but also questions detailing impressions of the enumeration area and household dynamics. For example, did the enumerator face challenges to entry into the community or household, and how threatening were these encounters? Gathering data on past exposure to violence could allow researchers to test hypotheses we suggest here, or at a minimum to control for these characteristics in analyses. We include our enumerator survey instrument in the Online Appendix to provide potential question wording that could be adapted to fit researcher needs.

• Finally, we encourage researchers to follow up with enumerators after survey completion, as suggested by our respondents. This moment may be particularly fruitful for gaining information that enumerators may have been hesitant to reveal during implementation.
Cumulatively, our results suggest two distinct research agendas for social scientists interested in understanding enumerator experiences. The clearest extension continues our focus on enumerators to explore their experiences in settings with different political dynamics. Future extensions could examine enumerator experiences in authoritarian regimes, contexts of high crime, and politically polarized settings. In these cases, enumerators may face unique and previously unobserved challenges worth studying.
The second agenda we see emerging from our findings is a need to investigate the mechanisms linking on-the-job exposure to violence to breaches in protocol. Do enumerators exposed to violence at time t change their behavior only during that project, or does the exposure durably shape their experience? Better-designed survey instruments may be better able to answer this question than our study has. With an eye toward improving enumerator experiences, can interventions that provide enumerators with mental health tools, like the ones suggested above, reduce the negative effects of violence exposure on their own safety and on data quality?
Our findings also return us to the respondent. By far the most common challenge noted by enumerators in open-ended questions was convincing respondents to participate; almost 50% of enumerators discussed difficulties with respondents in their qualitative answers, ranging from reluctant respondents to respondent fatigue to threats of violence. Such hesitancy could reflect a lack of institutional trust or a misunderstanding of survey objectives. This problem raises important questions about how public opinion research is understood in contexts of low literacy and/or high insecurity, though it is hardly unique to the Global South. Initial work in this vein can be found in Gengler et al. (2021), who study attitudes towards public opinion surveys in Qatar. We encourage more work of this nature that examines whether respondents have underlying expectations or beliefs that complicate data collection. Understanding such perceptions could offer ideas to alleviate these concerns and, by extension, a common source of stress (and indeed threat) for enumerators. Increases in global internet connectivity may offer one avenue for researchers to do better vis-à-vis respondents: researchers could provide respondents with a website and activation date where they can see aggregated study results, for example, or send a summary via text.

Conclusion
Survey research is a cornerstone of social science, and it has taken on new prominence in research on political violence and the Global South more broadly. Although this project is primarily descriptive, we have shed light on the unique challenges that enumerators in insecure settings may face while collecting data on behalf of academic researchers. Our results suggest that survey data collected in insecure settings are shaped by the challenges enumerators face on the ground and the solutions they adopt to address them. Importantly, we shift focus from enumerators' ascriptive characteristics, such as ethnicity or religion, to their lived experiences during data collection. We show in this paper that enumerator challenges to collecting data in insecure settings are greatly shaped by their personal experiences with violence. Enumerators who experience insecurity, whether through fear, personally experiencing violence, or witnessing it, are more likely to face safety challenges and sometimes break important research protocols, with far-reaching implications for data quality in the social sciences. This suggests that concerns over enumerator actions during fieldwork, such as fabricating data or skipping households, cannot neglect the role that enumerator safety plays in shaping the solutions they adopt.
Collectively, we hope that this paper helps to render enumerators' experiences visible by including their voices in our understanding of the research process. We believe that our findings hold broad implications for the social sciences: when we ask enumerators to work in potentially insecure communities, we must take these considerations into account. We have suggested ways to improve the conditions in which enumerators work and to ensure better data quality for researchers, as well as pressing areas for future research, including efforts to more precisely map enumerator experiences onto potential sources of bias. We hope that the findings here contribute to an expanded dialogue around the ethics of conducting research in insecure settings across the Global South.

Figure 3. Distribution of enumerator responses to the violence/fear index and violence-affected communities.

Figure 5. Relationship between insecurity and challenges to safety.

Figure 6. Relationship between insecurity and data collection difficulties.

Figure 9. Experience with violence and self-break protocol.

Figure 10. How often do you emphasize the following.

Figure 11. Experience with violence and affiliations.
Note: Projects N is coded

Table 2. Consequences for data validity from enumerator field decisions.