Mixed Reality Stock Trading Visualization System

In this paper, we present a novel mixed reality system for supporting stock market trading. The system enhances the trader's working environment by displaying an array of virtual screens that visualize financial stock data and related news feeds within the user's surroundings. We combined an nVisor ST50 head-mounted display with an InterSense InertiaCube4 orientation tracker and a Leap Motion controller to track head orientation and to let users control the mixed reality environment with their hands. Users can create, position, and manipulate the virtual screens directly with their hands in 3D space.
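
To make the interaction concrete, the sketch below shows one way a pinch gesture from the Leap Motion tracking data could be mapped to creating and dragging virtual screens. It is a minimal sketch assuming the Leap Motion SDK v2 Python bindings; the VirtualScreen class, the 80 mm grab radius, and the 0.9 pinch threshold are illustrative assumptions, not the paper's actual implementation, which would also render the screens in the HMD and fuse InertiaCube4 head-orientation data.

```python
# Hedged sketch: pinch-to-create/drag virtual screens with Leap Motion v2.
# VirtualScreen and the screen-management policy are hypothetical stand-ins
# for the system's rendering layer.
import sys
import Leap


class VirtualScreen(object):
    """Hypothetical placeholder for a floating quad rendered in the HMD."""

    def __init__(self, position):
        self.position = position  # Leap.Vector, millimetres in device space

    def move_to(self, position):
        self.position = position


class ScreenListener(Leap.Listener):
    PINCH_THRESHOLD = 0.9  # pinch_strength ranges from 0.0 to 1.0
    GRAB_RADIUS_MM = 80.0  # assumed radius for grabbing an existing screen

    def __init__(self):
        Leap.Listener.__init__(self)
        self.screens = []
        self.dragging = None

    def on_frame(self, controller):
        frame = controller.frame()
        if frame.hands.is_empty:
            self.dragging = None
            return
        hand = frame.hands.frontmost
        pos = hand.stabilized_palm_position
        if hand.pinch_strength > self.PINCH_THRESHOLD:
            if self.dragging is None:
                # Grab a nearby screen, or create a new one at the pinch point.
                self.dragging = self._nearest(pos) or self._create(pos)
            self.dragging.move_to(pos)
        else:
            self.dragging = None  # releasing the pinch drops the screen

    def _create(self, pos):
        screen = VirtualScreen(pos)
        self.screens.append(screen)
        return screen

    def _nearest(self, pos):
        candidates = [s for s in self.screens
                      if s.position.distance_to(pos) < self.GRAB_RADIUS_MM]
        if not candidates:
            return None
        return min(candidates, key=lambda s: s.position.distance_to(pos))


if __name__ == "__main__":
    listener = ScreenListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    print("Pinch to create or drag screens; press Enter to quit.")
    sys.stdin.readline()
    controller.remove_listener(listener)
```

The listener-based design keeps gesture handling decoupled from rendering: the tracking thread only updates screen positions, and the display loop can read them each frame.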
