Open Access Publications from the University of California

Racial bias in implicit danger associations generalizes to older male targets

  • Author(s): Lundberg, GJW; Neel, R; Lassetter, B; Todd, AR; et al.

© 2018 Lundberg et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Across two experiments, we examined whether implicit stereotypes linking younger (~28-year-old) Black versus White men with violence and criminality extend to older (~68-year-old) Black versus White men. In Experiment 1, participants completed a sequential priming task wherein they categorized objects as guns or tools after seeing briefly presented facial images of men who varied in age (younger versus older) and race (Black versus White). In Experiment 2, we used different face primes of younger and older Black and White men, and participants categorized words as ‘threatening’ or ‘safe.’ Results consistently revealed robust racial biases in object and word identification: Dangerous objects and words were identified more easily (faster response times, lower error rates), and non-dangerous objects and words were identified less easily, after seeing Black face primes than after seeing White face primes. Process dissociation procedure analyses, which aim to isolate the unique contributions of automatic and controlled processes to task performance, further indicated that these effects were driven entirely by racial biases in automatic processing. In neither experiment did prime age moderate racial bias, suggesting that the implicit danger associations commonly evoked by younger Black versus White men generalize to older Black versus White men.
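The process dissociation procedure mentioned above is commonly computed from two trial-level proportions: accuracy on stereotype-congruent trials and stereotypic errors on stereotype-incongruent trials. A minimal sketch of the standard estimating equations (in the style of Jacoby's process dissociation, as applied to priming tasks) is below; the function name and example probabilities are illustrative, not taken from the article:

```python
def process_dissociation(p_congruent_correct, p_incongruent_error):
    """Estimate controlled (C) and automatic (A) process contributions.

    p_congruent_correct: P(correct response | stereotype-congruent trial),
        where control and automatic bias both produce the correct answer.
    p_incongruent_error: P(stereotypic error | stereotype-incongruent trial),
        where the error occurs only when control fails and automatic
        bias drives the response: P(error) = A * (1 - C).
    """
    # Control supports correct responding on congruent trials and
    # opposes stereotypic errors on incongruent trials.
    control = p_congruent_correct - p_incongruent_error
    if control >= 1.0:
        # A is undefined when control is estimated at ceiling.
        return control, float("nan")
    automatic = p_incongruent_error / (1.0 - control)
    return control, automatic


# Illustrative numbers only: 95% correct on congruent trials,
# 20% stereotypic errors on incongruent trials.
c, a = process_dissociation(0.95, 0.20)  # C = 0.75, A = 0.80
```

Under this decomposition, a racial bias that appears only in the automatic (A) estimates, with control (C) unaffected by prime race, is the pattern the abstract describes as "driven entirely by racial biases in automatic processing."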

