Towards Artificial Systems: What Can We Learn from Human Perception?

Research in learning algorithms and sensor hardware has led to rapid advances in artificial systems over the past decade. Nevertheless, their performance continues to fall short of the efficiency and versatility of human behavior. A deeper understanding of how human perceptual systems process and act upon physical sensory information can therefore contribute in many ways to the development of better artificial systems. In the presented research, we highlight how the latest tools in computer vision, computer graphics, and virtual reality technology can be used to systematically investigate the factors that determine human performance in realistic, complex task-solving scenarios.
