DEEP NEAREST CLASS MEAN CLASSIFIERS

In this paper we introduce DeepNCM, a Nearest Class Mean classification method enhanced to directly learn highly non-linear deep (visual) representations of the data. To overcome the computationally expensive process of recomputing the class means after every update of the representation, we opt to approximate the class means with an online estimate. Moreover, to allow the class means to closely follow the drifting representation, we introduce a per-epoch mean condensation. Using online class means with condensation, DeepNCM can train efficiently on large datasets. Our (preliminary) experimental results indicate that DeepNCM performs on par with SoftMax-optimised networks.
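The core ideas above can be sketched in a few lines: maintain an online (decayed) estimate of each class mean as the representation evolves, periodically "condense" the means by recomputing them from the current features, and classify by nearest mean. This is a minimal NumPy sketch under stated assumptions; the decay rate `alpha`, the class names, and the exact condensation rule are illustrative choices, not the paper's precise formulation.

```python
import numpy as np

class OnlineClassMeans:
    """Illustrative online class-mean bookkeeping for an NCM classifier.

    NOTE: `alpha` and the condensation step are assumptions for the
    sketch, not the exact update rule used by DeepNCM.
    """

    def __init__(self, num_classes, feat_dim, alpha=0.9):
        self.means = np.zeros((num_classes, feat_dim))
        self.seen = np.zeros(num_classes, dtype=bool)
        self.alpha = alpha

    def update(self, features, labels):
        # Online update: blend the stored mean with the current batch mean,
        # avoiding a full recomputation over the dataset after every step.
        for c in np.unique(labels):
            batch_mean = features[labels == c].mean(axis=0)
            if not self.seen[c]:
                self.means[c] = batch_mean
                self.seen[c] = True
            else:
                self.means[c] = (self.alpha * self.means[c]
                                 + (1.0 - self.alpha) * batch_mean)

    def condense(self, features, labels):
        # Per-epoch condensation: reset the means from the current
        # representation so they track the drifting features.
        for c in np.unique(labels):
            self.means[c] = features[labels == c].mean(axis=0)

    def predict(self, features):
        # Nearest class mean: assign each sample to its closest mean
        # in Euclidean distance.
        dists = ((features[:, None, :] - self.means[None, :, :]) ** 2).sum(-1)
        return dists.argmin(axis=1)
```

In practice the features would come from the penultimate layer of a deep network, with `update` called per mini-batch and `condense` called once per epoch.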
