Ultrasound-driven Curveball in Table Tennis

Augmented Human (AH) is a research field that enhances human physical abilities or supports human activities using advanced technologies. As one AH approach, previous studies have attached actuators to the human body or to the tools used in an activity and controlled their movements to support that activity. In this study, instead of attaching actuators, we propose directly applying a noncontact ultrasound force to a lightweight tool to manipulate it. The advantage of a noncontact force is that users neither need to wear a dedicated device nor modify the tools used for the activity. As a proof-of-concept system, we developed an ultrasound-based curveball system with which table tennis players can hit a curveball regardless of their physical ability. In this system, a moving ping-pong ball (PPB) is the target of remote manipulation. The system curves the trajectory of the moving PPB by continuously focusing ultrasound on it. Users can control the curve timing and the curve direction (left or right) using a racket-shaped controller. In a user study, we conducted an actual table tennis match using the curveball system and qualitatively confirmed that the player using the system had the upper hand. Another user study using a ball dispenser quantitatively showed that the ultrasound-driven curveball increased the opposing player's mistakes by a factor of 2.95. These results indicate that the proposed concept is feasible.
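
To illustrate the idea of continuously refocusing ultrasound on a tracked ball, the following is a minimal control-loop sketch, not the authors' implementation. It assumes a camera-based ball tracker, a racket-shaped controller that reports the desired curve direction, and one ultrasound phased array on each side of the table; all class and method names (get_ball_position, curve_direction, focus_at, turn_off) are hypothetical stand-ins.

```python
# Hypothetical control-loop sketch for an ultrasound-driven curveball system.
# The tracker, controller, and phased-array interfaces are illustrative
# stand-ins, not the actual APIs used in the paper.

import time

LOOP_PERIOD_S = 0.001  # assumed 1 kHz tracking/refocusing rate


def curveball_loop(tracker, controller, left_array, right_array):
    """Continuously focus ultrasound on the moving ball to curve its path.

    tracker.get_ball_position() -> (x, y, z) in meters, or None if not tracked
    controller.curve_direction() -> "left", "right", or None (no curve requested)
    *_array.focus_at(x, y, z) / *_array.turn_off() -> steer or disable the focus
    """
    while True:
        position = tracker.get_ball_position()
        direction = controller.curve_direction()

        if position is not None and direction is not None:
            # Fire the array on the side opposite the desired curve so the
            # acoustic radiation pressure at the focus pushes the ball laterally.
            array = right_array if direction == "left" else left_array
            array.focus_at(*position)
        else:
            left_array.turn_off()
            right_array.turn_off()

        time.sleep(LOOP_PERIOD_S)
```

Under these assumptions, the key design point is the loop rate: the focus must be updated much faster than the ball moves across the focal region, which is why a high-rate tracker and a phased array with low-latency focus steering are assumed.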
