Multi-class Multi-label Classification for Cooking Activity Recognition
[1] Aurélien Géron. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 2017.
[2] Bernt Schiele, et al. Recognizing Fine-Grained and Composite Activities Using Hand-Centric Features and Script Data, 2015, International Journal of Computer Vision.
[3] Akane Sano, et al. Multimodal autoencoder: A deep learning approach to filling in missing sensor data and enabling better mood prediction, 2017, 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII).
[4] Sozo Inoue, et al. Achieving Single-Sensor Complex Activity Recognition from Multi-Sensor Training Data, 2020, arXiv.
[5] Bernt Schiele, et al. A tutorial on human activity recognition using body-worn inertial sensors, 2014, CSUR.
[6] Thomas Plötz, et al. Ensembles of Deep LSTM Learners for Activity Recognition using Wearables, 2017, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.
[7] Jessica K. Hodgins, et al. Guide to the Carnegie Mellon University Multimodal Activity (CMU-MMAC) Database, 2008.
[8] S. Kriemler, et al. Accelerometer-derived physical activity estimation in preschoolers – comparison of cut-point sets incorporating the vector magnitude vs the vertical axis, 2019, BMC Public Health.
[9] Andriy Burkov. The Hundred-Page Machine Learning Book, 2019.
[10] Nitesh V. Chawla, et al. Imputing Missing Social Media Data Stream in Multisensor Studies of Human Behavior, 2019, 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII).
[11] Valentin Radu, et al. Multimodal Deep Learning for Activity and Context Recognition, 2018, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.
[12] Majid Mirmehdi, et al. Recognition of unscripted kitchen activities and eating behaviour for health monitoring, 2016.
[13] Majid Mirmehdi, et al. What's cooking and why? Behaviour recognition during unscripted cooking tasks for health monitoring, 2017, 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops).
[14] Moritz Tenorth, et al. The TUM Kitchen Data Set of everyday manipulation activities for motion tracking and action recognition, 2009, 2009 IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops).
[15] Patrick Olivier, et al. A Dynamic Time Warping Approach to Real-Time Activity Recognition for Food Preparation, 2010, AmI.
[16] Md. Nasir Sulaiman, et al. Multi-Label Classification for Physical Activity Recognition from Various Accelerometer Sensor Positions, 2018.
[17] Majid Mirmehdi, et al. Analysing Cooking Behaviour in Home Settings: Towards Health Monitoring, 2019, Sensors.
[18] Bernt Schiele, et al. An Analysis of Sensor-Oriented vs. Model-Based Activity Recognition, 2009, 2009 International Symposium on Wearable Computers.
[19] Sozo Inoue, et al. A dataset for complex activity recognition with micro and macro activities in a cooking scenario, 2020, arXiv.
[20] Petia Radeva, et al. Food Ingredients Recognition Through Multi-label Learning, 2017, ICIAP Workshops.
[21] Paul Lukowicz, et al. Collecting complex activity datasets in highly rich networked sensor environments, 2010, 2010 Seventh International Conference on Networked Sensing Systems (INSS).
[22] Franceska Xhakaj, et al. EduSense: Practical Classroom Sensing at Scale, 2019, Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.
[23] Max Mühlhäuser, et al. Capturing Daily Student Life by Recognizing Complex Activities Using Smartphones, 2017, MobiQuitous.
[24] Jani Bizjak, et al. A New Frontier for Activity Recognition: The Sussex-Huawei Locomotion Challenge, 2018, UbiComp/ISWC Adjunct.
[25] Bernt Schiele, et al. A database for fine grained activity detection of cooking activities, 2012, 2012 IEEE Conference on Computer Vision and Pattern Recognition.