Dynamic Control Under Changing Goals
Abstract
Acting effectively in the world requires a representation that can be leveraged to serve one's goals. One practical reason that intelligent agents might learn to represent causal structure is that it enables flexible adaptation to a changing environment. For example, understanding how to play a video game allows one to pursue other goals, such as doing as poorly as possible or only gathering one type of item. Across two experiments that manipulated the expected utility of learning causal structure, we find that people did not build causal representations in dynamic environments. This conclusion was supported by behavioral results as well as by participants being better fit by models describing them as utilizing minimally complex, reactive control policies. The results show how, despite being incredibly adaptive, people are in fact computationally frugal, minimizing the complexity of their representations and decision policies even in situations that might warrant richer ones.