Assessment and Benchmarking of XoNLI: a Natural Language Processing Interface for Industrial Exoskeletons

Industrial exoskeletons are a potential solution for reducing work-related musculoskeletal disorders during carrying or lifting tasks. Equipped with sensors, electrical/pneumatic actuators, and control systems, active exoskeletons offer a more versatile control scheme, since different assistive strategies can be selected according to the task being performed. From this perspective, a human-machine interface is required to safely open basic exoskeleton functionalities to the user and to provide an adaptable setup system. This article presents the assessment and benchmarking of the novel XoLab Natural Language Interface (XoNLI), a voice user interface for the interaction with and configuration of industrial active exoskeletons. The evaluation of the novel interface was performed by 17 participants who completed setup and operational activities while wearing the XoTrunk exoskeleton. The benchmark compared the proposed interface with two previously developed adaptable interfaces for the exoskeleton: the user command interface and the monitor system interface. The results showed that, although the novel interface exhibited a considerable lag in response time, it was rated as more attractive, stimulating, and novel than the standard one. However, the standard interface scored better than both the user command interface and the voice interface in terms of perspicuity and efficiency.
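
To illustrate the kind of interaction such a voice interface enables, the sketch below shows a minimal, hypothetical command dispatcher: a speech-to-text front end (not shown) produces a transcript, which is then mapped onto exoskeleton configuration actions such as selecting an assistive strategy for lifting or carrying. All names here (ExoController, the command phrases, the strategy labels) are illustrative assumptions and are not part of the actual XoNLI implementation.

```python
"""Minimal, hypothetical sketch of a voice-command dispatcher for an
active back-support exoskeleton. Names and commands are illustrative
assumptions, not the XoNLI API."""

from dataclasses import dataclass


@dataclass
class ExoController:
    """Stand-in for the exoskeleton control layer (assumed interface)."""
    strategy: str = "transparent"
    assistance_level: int = 0

    def set_strategy(self, strategy: str) -> None:
        self.strategy = strategy
        print(f"[exo] assistive strategy set to '{strategy}'")

    def set_assistance_level(self, level: int) -> None:
        self.assistance_level = max(0, min(10, level))
        print(f"[exo] assistance level set to {self.assistance_level}")


def dispatch(transcript: str, exo: ExoController) -> bool:
    """Map a recognized utterance (already transcribed by a generic
    speech-to-text front end) onto a configuration action.
    Returns True if the utterance matched a known command."""
    text = transcript.lower().strip()
    if "lifting" in text:
        exo.set_strategy("lifting")        # task-specific assistance for lifting
    elif "carrying" in text:
        exo.set_strategy("carrying")       # task-specific assistance for carrying
    elif "stop" in text or "transparent" in text:
        exo.set_strategy("transparent")    # no active support
    elif "increase assistance" in text:
        exo.set_assistance_level(exo.assistance_level + 1)
    elif "decrease assistance" in text:
        exo.set_assistance_level(exo.assistance_level - 1)
    else:
        return False                       # unrecognized: ask the user to repeat
    return True


if __name__ == "__main__":
    exo = ExoController()
    for utterance in ("switch to lifting mode", "increase assistance", "stop"):
        if not dispatch(utterance, exo):
            print(f"[ui] sorry, I did not understand: '{utterance}'")
```

In a real system the dispatcher would sit behind a speech-recognition and intent-parsing stage, and unmatched utterances would trigger a spoken clarification prompt rather than a console message; the sketch only conveys the mapping from recognized commands to assistive-strategy selection described above.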
