
Using a Language Transformer Model to Capture Creativity in Improvised Narratives

Creative Commons Attribution 4.0 (CC BY) license
Abstract

Humans often communicate through spoken or written narratives, and assessing story creativity is typically thought to be a highly subjective and uniquely human ability. To challenge this assumption, we explored whether a language transformer model (BERT) could generate metrics to assess narrative creativity automatically. We collected 790 audio-recorded improvised stories based on varying prompts and used a subset of 18 transcripts in this preliminary study. Stories with a higher average BERT semantic embedding distance between all sentences were rated as more imaginative (r = 0.48, p = 0.044) and more complex (r = 0.52, p = 0.028) according to the averaged ratings of seven creative storytelling experts. Additionally, sentence-level embedding distances predicted human ratings better than word-level embedding distances (p < 0.05). Together, these findings highlight BERT as a useful tool for automatically assessing narrative creativity and invite further fine-grained investigation of the features that describe creativity in natural communication.
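
As a rough illustration of the kind of metric the abstract describes, the sketch below computes the mean pairwise distance between BERT sentence embeddings for a short story. The checkpoint (bert-base-uncased), mean pooling over final-layer token vectors, and cosine distance are assumptions made for this example; the abstract does not specify which BERT variant, pooling strategy, or distance measure the authors used.

    # Minimal sketch: average pairwise semantic distance between the
    # sentence embeddings of one story. Model choice, pooling, and the
    # distance metric are illustrative assumptions, not the paper's exact setup.
    from itertools import combinations

    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def sentence_embedding(sentence: str) -> torch.Tensor:
        """Embed one sentence by mean-pooling BERT's final-layer token vectors."""
        inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
        with torch.no_grad():
            outputs = model(**inputs)
        # outputs.last_hidden_state has shape (1, seq_len, hidden_size)
        return outputs.last_hidden_state.mean(dim=1).squeeze(0)

    def mean_pairwise_distance(sentences: list[str]) -> float:
        """Average cosine distance over all pairs of sentence embeddings."""
        embeddings = [sentence_embedding(s) for s in sentences]
        distances = [
            1.0 - torch.nn.functional.cosine_similarity(a, b, dim=0).item()
            for a, b in combinations(embeddings, 2)
        ]
        return sum(distances) / len(distances)

    story = [
        "A lighthouse keeper found a message in a bottle.",
        "The note was written in a language no one alive could read.",
        "She spent the winter teaching herself to answer it.",
    ]
    print(f"Mean pairwise sentence distance: {mean_pairwise_distance(story):.3f}")

Under this reading of the method, a story whose sentences wander farther apart in embedding space yields a larger score, which is the quantity the abstract reports as correlating with expert ratings of imagination and complexity.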
