Exploring the Racial Bias in Pain Detection with a Computer Vision Model
People detect painful expressions more easily in members of their racial ingroup than in members of their outgroup. Here, we investigated this racial bias with a machine learning model trained to detect activations of the facial action units involved in painful expressions. We examined whether the model, trained on datasets containing mostly White faces, detected higher action unit activation in European than in African faces. To control for confounding variables, the face images were generated with the FaceGen Modeller. Results revealed differences in the visual detectability of some facial muscle activations attributable to skin color or other race-dependent facial features. However, despite the bias toward European-looking faces in the training data, some activations were more easily detectable in African faces. Thus, neither perceptual detectability nor greater exposure to own-race faces alone seems to explain the racial bias in pain detection.