Applying Transfer Learning to Recognize Clothing Patterns Using a Finger-Mounted Camera

Existing color identification tools neither recognize visual patterns nor allow users to quickly inspect multiple locations, both of which are important for identifying clothing. We are exploring the use of a finger-based camera that allows users to query clothing colors and patterns by touch. Previously, we demonstrated the feasibility of this approach using a small, highly controlled dataset and a combination of two image classification techniques commonly used for object recognition. Here, to improve scalability and robustness, we collect a dataset of fabric images from online sources and apply transfer learning to train an end-to-end deep neural network to recognize visual patterns. This new approach achieves 92% accuracy in a general case and 97% when tuned for images from a finger-mounted camera.
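
The abstract does not specify the framework, backbone network, or training details behind the transfer-learning approach. The snippet below is a minimal, hypothetical sketch of that kind of setup: it fine-tunes an ImageNet-pretrained ResNet-18 end to end on a folder of fabric images using PyTorch/torchvision. The dataset path, pattern class labels, and hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of transfer learning for clothing-pattern recognition.
# Assumes PyTorch/torchvision and an ImageFolder-style dataset of fabric images;
# the architecture, class labels, paths, and hyperparameters are illustrative only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Example pattern categories (hypothetical labels, not from the paper).
PATTERN_CLASSES = ["solid", "striped", "checkered", "dotted", "floral"]

# Standard ImageNet preprocessing so the pretrained weights remain meaningful.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset directory laid out as fabric_images/train/<class_name>/*.jpg
train_data = datasets.ImageFolder("fabric_images/train", transform=preprocess)
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a network pretrained on ImageNet and replace the classifier head
# with one sized for the fabric-pattern classes.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(PATTERN_CLASSES))

# Fine-tune end to end: all layers receive gradients, with a smaller learning
# rate for the pretrained backbone than for the freshly initialized head.
optimizer = torch.optim.SGD([
    {"params": [p for n, p in model.named_parameters() if not n.startswith("fc")],
     "lr": 1e-4},
    {"params": model.fc.parameters(), "lr": 1e-3},
], momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same pattern extends to the paper's two settings: train on web-collected fabric images for the general case, then continue fine-tuning on images captured by the finger-mounted camera to specialize the model for that input.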
