What has been missed for real-life driving? Inspirational thinking from human innate biases

Nature is remarkable for her imbalance. Innate biases, shared by non-human and human species, form a fascinating yet mysterious biological foundation that can inspire real-world applications. Vision researchers have documented a left gaze bias in both humans and non-humans, while acousticians have observed a right ear advantage in both groups. Unlike the vision and hearing researchers who investigate the underlying mechanisms of these innate biases, we are more interested in mimicking their characteristics. In this paper, we propose two simple yet effective methods to generate a left gaze bias and a right ear advantage. We further discuss potential applications of these inherent phenomena, e.g., real-life driving. We believe that, by implementing these human innate biases properly, this work could have an inspirational impact on future cognitive transportation.
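
The abstract does not detail the two proposed methods. As a purely illustrative sketch, not the authors' method, one way to "generate" a left gaze bias and a right ear advantage computationally is to apply an asymmetric spatial weighting to a visual saliency map and an asymmetric channel gain to stereo audio; the function names, the linear ramp, and the fixed decibel boost below are all assumptions introduced for illustration.

import numpy as np

def left_gaze_biased_saliency(saliency_map, strength=0.5):
    """Weight a saliency map so the left half of the visual field is favoured.

    saliency_map is a 2-D array (rows x cols); strength controls how much the
    weight decays from the left edge to the right edge. The linear ramp is an
    illustrative assumption, not the method proposed in the paper.
    """
    rows, cols = saliency_map.shape
    # Ramp from 1.0 at the left edge down to (1 - strength) at the right edge.
    weights = np.linspace(1.0, 1.0 - strength, cols)
    return saliency_map * weights[np.newaxis, :]

def right_ear_advantaged_mix(stereo, gain_db=3.0):
    """Boost the right channel of a stereo signal by gain_db decibels.

    stereo is an (n_samples, 2) float array with columns (left, right). The
    fixed decibel boost is likewise only a placeholder for whatever weighting
    the paper's second method actually uses.
    """
    gain = 10.0 ** (gain_db / 20.0)
    out = stereo.copy()
    out[:, 1] *= gain
    return out

# Minimal usage example with random data.
saliency = np.random.rand(480, 640)
biased = left_gaze_biased_saliency(saliency, strength=0.4)

audio = np.random.randn(16000, 2) * 0.1
advantaged = right_ear_advantaged_mix(audio, gain_db=3.0)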
