Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention
University of Southern California, Los Angeles, Dept. of Computer Science
We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up, image-based attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving rhesus monkeys. The system autonomously saccades towards and tracks salient targets in a variety of video clips, including synthetic stimuli, real outdoor scenes, and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting the applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.
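To illustrate the kind of bottom-up, image-based computation such an attention model performs, the sketch below builds a toy saliency map from center-surround differences on an intensity channel. This is a minimal, hypothetical simplification (function names, the choice of Gaussian scales, and the single intensity channel are assumptions for illustration), not the system described in the abstract, which combines multiple feature channels and a neurally grounded eye/head controller.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_map(image):
    """Toy bottom-up saliency: center-surround differences on intensity.

    A hypothetical, single-channel simplification of saliency-based
    attention; real models pool several feature channels and scales.
    """
    intensity = image.astype(float)
    if intensity.ndim == 3:
        # Collapse color to a single intensity channel.
        intensity = intensity.mean(axis=2)
    sal = np.zeros_like(intensity)
    # Center-surround: fine-scale (center) minus coarse-scale (surround)
    # Gaussian-blurred copies, at a few assumed scale pairs.
    for c_sigma, s_sigma in [(1, 4), (1, 8), (2, 8)]:
        center = gaussian_filter(intensity, sigma=c_sigma)
        surround = gaussian_filter(intensity, sigma=s_sigma)
        sal += np.abs(center - surround)
    # Normalize to [0, 1] so maps from different inputs are comparable.
    return sal / sal.max() if sal.max() > 0 else sal

# Usage: a bright blob on a dark background should be the most salient
# location, i.e. the candidate saccade target.
img = np.zeros((64, 64))
img[30:34, 30:34] = 1.0
sm = saliency_map(img)
peak = np.unravel_index(np.argmax(sm), sm.shape)
```

In a full attention system, the peak of the saliency map would feed a winner-take-all stage that selects the next saccade target for the eye/head controller.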