Analysis of Incremental Design Changes in Video Games with Automatic Exploration
Videogames are software systems expressed in code and data, but it is far from obvious how changing one piece of code will affect the experience a human player has with the game. Traditional playtesting and quality assurance testing remain the gold standard for ensuring a quality gaming experience, but they carry considerable resource and morale costs to guarantee that testing procedures will uncover problems in the user experience. Further, these testing traditions do not scale down to incremental, in-development builds of videogame software, concentrating the entirety of videogame quality control at the end of long stretches of coding.
With the advent of advanced AI gameplaying algorithms, I present a way to leverage gameplaying AI as an assistive tool for game developers. Rather than playing to win, these AI techniques aim to explore the playable areas of a videogame that a human player could reach, potentially uncovering gameplay that the developers did not intend to implement. I have developed two exploration techniques: one that relies on human gameplay traces, and one that self-improves from only a single seed of human gameplay. Building on this exploration data, I also present a visualization workflow that generates visual reports of gameplay differences between versions of a videogame.