Conference
arXiv.org, 2021
APA
Alicea, B., Chakrabarty, R., Gopi, A., Lim, A., & Parent, J. (2021). Embodied Continual Learning Across Developmental Time Via Developmental Braitenberg Vehicles. arXiv.
Chicago/Turabian
Alicea, Bradly, Rishabh Chakrabarty, Akshara Gopi, Avery Lim, and Jesse Parent. “Embodied Continual Learning Across Developmental Time Via Developmental Braitenberg Vehicles.” ArXiv.org, 2021.
MLA
Alicea, Bradly, et al. “Embodied Continual Learning Across Developmental Time Via Developmental Braitenberg Vehicles.” ArXiv.org, 2021.
BibTeX
@conference{bradly2021a,
title = {Embodied Continual Learning Across Developmental Time Via Developmental Braitenberg Vehicles},
year = {2021},
journal = {arXiv.org},
author = {Alicea, Bradly and Chakrabarty, Rishabh and Gopi, Akshara and Lim, Avery and Parent, Jesse}
}
There is much to learn through the synthesis of Developmental Biology, Cognitive Science, and Computational Modeling. One lesson we can learn from this perspective is that the initialization of intelligent programs cannot rely solely on the manipulation of numerous parameters. Our path forward is to present a design for developmentally-inspired learning agents based on the Braitenberg Vehicle. Using these agents to exemplify artificial embodied intelligence, we move closer to modeling embodied experience and morphogenetic growth as components of cognitive developmental capacity. We consider various factors regarding biological and cognitive development which influence the generation of adult phenotypes and the contingency of available developmental pathways. These mechanisms produce emergent connectivity with shifting weights and adaptive network topography, thus illustrating the importance of developmental processes in training neural networks. This approach provides a blueprint for adaptive agent behavior that might result from a developmental approach: namely, by exploiting critical periods of growth and acquisition, an explicitly embodied network architecture, and a distinction between the assembly of neural networks and active learning on these networks.
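To make the Braitenberg Vehicle concrete, the following is a minimal sketch of the classic sensor-motor wiring the abstract builds on, not the paper's actual implementation. All class and parameter names here are hypothetical; the crossed-wiring variant (Braitenberg's Vehicle 2b) steers toward a stimulus because each sensor excites the contralateral motor.

```python
import math

class BraitenbergVehicle:
    """Illustrative two-sensor, two-motor Braitenberg Vehicle.

    With crossed=True (Vehicle 2b), the stronger sensor reading drives
    the motor on the opposite side, turning the agent toward the source.
    """

    def __init__(self, x=0.0, y=0.0, heading=0.0, crossed=True):
        self.x, self.y, self.heading = x, y, heading
        self.crossed = crossed            # crossed wiring -> source-seeking
        self.sensor_angle = math.pi / 4   # sensor offset from heading
        self.base = 0.5                   # distance between the two wheels

    def sense(self, lx, ly):
        """Stimulus intensity at each sensor (inverse-square-like falloff)."""
        readings = []
        for side in (+1, -1):             # left sensor, then right sensor
            a = self.heading + side * self.sensor_angle
            sx, sy = self.x + math.cos(a), self.y + math.sin(a)
            d2 = (lx - sx) ** 2 + (ly - sy) ** 2
            readings.append(1.0 / (1.0 + d2))
        return readings                   # [left, right]

    def step(self, lx, ly, dt=0.1):
        """One differential-drive update toward (or away from) a source."""
        left, right = self.sense(lx, ly)
        if self.crossed:                  # sensor excites contralateral motor
            v_left, v_right = right, left
        else:                             # ipsilateral wiring (Vehicle 2a)
            v_left, v_right = left, right
        v = (v_left + v_right) / 2.0              # forward speed
        omega = (v_right - v_left) / self.base    # turning rate
        self.heading += omega * dt
        self.x += v * math.cos(self.heading) * dt
        self.y += v * math.sin(self.heading) * dt
```

A developmental variant, as the abstract suggests, would grow and reweight these sensor-motor connections over the agent's lifetime rather than fixing them at initialization; the wiring flag above is the static placeholder for that process.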