Students studying complex science topics can benefit from receiving immediate, personalized guidance. Supporting students in revising their written science explanations can help them integrate disparate ideas and develop a coherent, generative account of complex scientific topics. Using natural language processing to analyze student written work, this dissertation compares forms of automated guidance designed to motivate productive revision and help students improve their science understanding. Online environments can support science learning by providing timely, personalized guidance to students, but challenges to effective implementation remain. Specifically, (a) students often believe online guidance is generic rather than personalized for them; and (b) students do not always engage effortfully with online guidance and improve their written responses. This dissertation includes a series of three studies that address these challenges. A computerized learning environment is used to explore useful and motivating forms of automated guidance for middle school students learning challenging science topics such as thermodynamics.
Informed by the knowledge integration framework and established ideas about student motivation, these studies examine effective designs for automated guidance. Study 1 demonstrates that automated knowledge integration guidance provided by a computer can promote integrated understanding of science as effectively as expert teacher guidance. In addition, students who began to distinguish scientific ideas after receiving guidance in the embedded assessment, as evidenced by adding either non-normative or normative ideas to their response, made greater gains over the course of the unit than those who did not add any new ideas in response to guidance. However, in this study some students discounted automated computer guidance, assuming it was generic rather than personalized. In Study 2, transparent guidance clarified to students how the computer generated personalized guidance based on their response. Results showed that transparent personalized guidance had a greater impact on student revisions than standard adaptive guidance, suggesting that student beliefs about how guidance is designed influence their performance. Transparent guidance was particularly effective for students who started with low initial scores. This finding resonates with the idea that students who felt they were struggling may have particularly benefited from the reassurance that the automated guidance was provided at a level they were expected to be able to achieve. Study 3 compares two specific guidance strategies prior to revision: revisiting evidence and planning writing changes. Analysis of student actions after receiving guidance demonstrated that students in the revisiting evidence condition were more likely to revisit prior evidence, and students in the planning writing condition were more likely to make significant writing revisions.
Both revisiting and planning guidance resulted in significant improvement in student knowledge integration, although neither guidance strategy showed a significant advantage over the other. In addition, the form of guidance interacted with school, suggesting that teacher practices may reinforce a specific guidance strategy.
This sequence of studies shows that the design of online guidance is important in encouraging students to revisit dynamic models and make effortful revisions to their work. Carefully designed automated guidance can augment the effectiveness of teachers by motivating students to better use computer learning environments and make effortful revisions that ultimately improve science learning. The results also raise important questions about when to encourage revisiting, how to design instruction that best fits with individual classroom strategies, and how to instill a lifelong practice of engaging in iterative refinement of scientific explanations.