bRIGHT - Workstations of the Future and Leveraging Contextual Models

Experimenting with futuristic computer workstation design and specifically tailored application models can yield useful insights and lead to exciting ways to increase efficiency, effectiveness, and satisfaction for computer users. Designing and building a computer workstation that can track a user’s gaze; sense proximity to the touch surface; and support multi-touch and face recognition meant overcoming some unique technological challenges. Coupled with extensions to commonly used applications that report user interactions in a meaningful way, the workstation allows the development of a rich contextual user model that is accurate enough to enable benefits such as contextual filtering, task automation, contextual auto-fill, and improved understanding of team collaborations. SRI’s bRIGHT workstation was designed and built to explore these research avenues: to investigate how such a context model can be built, to identify the key implications in designing an application model that best serves these goals, and to discover other related factors. This paper also proposes future research that would support the development of a collaborative context model offering similar benefits to groups of users.
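The paper does not specify how the contextual user model is represented, but the idea of aggregating reported interactions (gaze, touch, application events) into a model that drives contextual filtering and auto-fill can be sketched minimally as follows. The event schema and method names here (`InteractionEvent`, `contextual_filter`, `autofill_suggestion`) are illustrative assumptions, not bRIGHT's actual API.

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import List

@dataclass
class InteractionEvent:
    # Hypothetical schema: the fields below are assumptions for illustration.
    modality: str   # e.g. "gaze", "touch", "keyboard"
    target: str     # UI element or document the event refers to
    app: str        # application that reported the interaction

@dataclass
class ContextModel:
    events: List[InteractionEvent] = field(default_factory=list)

    def record(self, event: InteractionEvent) -> None:
        # Applications extended to report interactions would call this.
        self.events.append(event)

    def contextual_filter(self, candidates: List[str], top_k: int = 3) -> List[str]:
        # Rank candidate items by how often the user has attended to them.
        weights = Counter(e.target for e in self.events)
        return sorted(candidates, key=lambda c: -weights[c])[:top_k]

    def autofill_suggestion(self) -> str:
        # Suggest the most recently gazed-at target as an auto-fill value.
        for e in reversed(self.events):
            if e.modality == "gaze":
                return e.target
        return ""

# Usage: two attention events on "report.pdf" outrank one on "chart.png".
model = ContextModel()
model.record(InteractionEvent("gaze", "report.pdf", "viewer"))
model.record(InteractionEvent("touch", "report.pdf", "viewer"))
model.record(InteractionEvent("gaze", "chart.png", "viewer"))
```

A real model would weight modalities differently (a fixation is weaker evidence than a click) and decay old events, but even this flat event log shows how per-user interaction history can drive filtering and auto-fill.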
