ARPads: Mid-air Indirect Input for Augmented Reality
Olivier Chapuis | Caroline Appert | Nicolas Férey | Eugénie Brasier | Jeanne Vézien