In both cognitive science and artificial intelligence (AI), perception has traditionally been conceptualized as the internal reconstruction of external stimuli. In this representational view, sensory systems transform input into internal models that guide cognition and action. However, converging evidence from neuroscience, perceptual science, developmental psychology, autism research, robotics, and contemporary AI increasingly challenges this assumption. Across these domains, perception appears to emerge through active, embodied engagement with the environment rather than through passive signal processing or static internal representation. Embodied cognition theories propose that perceptual meaning arises from lawful relations among bodily constraints, action, temporal coordination, and environmental feedback, emphasizing perception as an ongoing process of interaction. In parallel, recent advances in AI have shifted away from purely feedforward or data-driven perceptual architectures toward closed-loop, predictive, and self-organizing systems in which perception and action are inseparable components of adaptive behavior. Approaches such as embodied reinforcement learning, active inference, and world-model-based learning treat perception as emerging through sensorimotor interaction and temporally structured regulation rather than through inference alone. This theoretical paper integrates embodied cognition with contemporary AI-driven models of perception, arguing that embodiment functions as a generative constraint enabling robust, context-sensitive, and developmentally grounded sensory cognition across biological and artificial systems. We further extend this framework to autism spectrum disorder (ASD), proposing that many sensory–perceptual differences in autism can be understood as variations in embodied self-organization, predictive regulation, and temporal coordination rather than as deficits in abstract cognition. Finally, we discuss how embodied AI systems can serve as formal testbeds for exploring autism-relevant perceptual mechanisms and for designing adaptive, interaction-based technologies that support perceptual coherence without imposing normative behavioral models.
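The contrast between passive reconstruction and closed-loop, prediction-error-driven perception can be conveyed in a toy simulation. The sketch below is purely illustrative and is not a model from the paper: the agent, its single scalar state, the learning rate, and the action gain are all assumed for the example. The agent holds an internal prediction of a sensed quantity and reduces its prediction error through two coupled channels, updating the prediction (perception-as-inference) and acting on the environment (perception-as-regulation), so that perception and action jointly stabilize the sensorimotor loop.

```python
import random

def closed_loop_agent(steps=200, lr=0.1, action_gain=0.05, seed=0):
    """Toy closed-loop predictive agent (illustrative only).

    Perception and action jointly minimize prediction error, rather
    than perception passively reconstructing the stimulus.
    """
    rng = random.Random(seed)
    env_state = 5.0   # hidden environmental variable (assumed scalar)
    mu = 0.0          # agent's internal prediction of the sensed value
    errors = []
    for _ in range(steps):
        obs = env_state + rng.gauss(0, 0.1)  # noisy sensory sample
        err = obs - mu                       # prediction error
        mu += lr * err                       # perceptual update (inference)
        env_state -= action_gain * err       # action also reduces the error
        errors.append(abs(err))
    return errors

errors = closed_loop_agent()
# Over the interaction loop, the error trace decays toward the sensory
# noise floor: neither inference nor action alone, but their coupling,
# produces the stable percept.
```

Because both update rules share the same error signal, the gap between prediction and environment shrinks geometrically; in this toy setting the two channels are interchangeable in function, which is the point the abstract's closed-loop framing makes.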