Recent progress in artificial intelligence has renewed interest in
building machines that learn like animals. Almost all prior
work comparing learning across biological and artificial systems
comes from studies in which animals and machines received
different training data, obscuring whether observed differences
between animals and machines stem from differences in learning
mechanisms or in training data. We present an experimental
approach—a “newborn embodied Turing Test”—that allows
newborn animals and machines to be raised in the same
environments and tested with the same tasks, permitting direct
comparison of their learning abilities. To make this platform, we
first collected controlled-rearing data from newborn chicks, then
performed “digital twin” experiments in which machines were
raised in virtual environments that mimicked the rearing
conditions of the chicks. We found that (1) machines (deep
reinforcement learning agents with intrinsic motivation) can
spontaneously develop visually guided preference behavior, akin
to imprinting in newborn chicks, and (2) machines are still far
from newborn-level performance on object recognition tasks.
Almost all of the chicks developed view-invariant object
recognition, whereas the machines tended to develop
view-dependent recognition. The learning outcomes were also
far more constrained in the chicks than in the machines. Ultimately,
we anticipate that this approach will help researchers develop
embodied AI systems that learn like newborn animals.