Deep neural networks may suffer from large variance in small-sample classification tasks. Motivated by the fact that ensembling is an effective way of reducing variance, this paper proposes a new ensemble method, Selective Snapshot Ensembling, which introduces a selective initialization strategy into Snapshot Ensembling, an ensemble method for neural networks whose effectiveness has been demonstrated on many datasets. The proposed method discards poor initializations of the network parameters; taking the cross-entropy loss as an example, a poor initialization is one from which the network cannot converge to a sufficiently small loss value. Specifically, we set an epoch threshold in Snapshot Ensembling at which the training loss is monitored, so that only good initializations are retained. Experimental results on two small-sample datasets, the LabelMe dataset and the 80-class AI-Challenger dataset, demonstrate that the proposed ensemble method outperforms Snapshot Ensembling and other baseline methods.
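To make the selection rule concrete, the following is a minimal sketch of one plausible reading of the method, written in a PyTorch style. It is not the authors' implementation: the function names (`make_model`, `train_cycle`, `selective_snapshot_ensemble`) and the hyperparameters (`check_epoch`, `loss_threshold`, `epochs_per_cycle`, `max_restarts`) are illustrative assumptions. Each cosine-annealed cycle ends in a snapshot, as in standard Snapshot Ensembling; the selective step checks the running loss at the monitoring epoch and, if it is still above the threshold, discards the current initialization and draws a fresh one.

```python
# A hypothetical sketch of Selective Snapshot Ensembling; all names and
# hyperparameter values below are assumptions, not taken from the paper.
import copy
import torch
import torch.nn as nn

def train_cycle(model, loader, epochs, check_epoch, loss_threshold, lr=0.1):
    """Train one snapshot cycle with a cosine-annealed learning rate.

    Returns (converged, model): converged is False when, at check_epoch,
    the mean training loss is still above loss_threshold, i.e. the current
    initialization is judged poor and should be discarded.
    """
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)
    for epoch in range(epochs):
        total_loss, n_samples = 0.0, 0
        for x, y in loader:
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
            total_loss += loss.item() * x.size(0)
            n_samples += x.size(0)
        scheduler.step()
        # Selective step: abandon initializations whose loss has not
        # dropped below the threshold by the monitoring epoch.
        if epoch == check_epoch and total_loss / n_samples > loss_threshold:
            return False, model
    return True, model

def selective_snapshot_ensemble(make_model, loader, n_snapshots=5,
                                epochs_per_cycle=40, check_epoch=10,
                                loss_threshold=0.5, max_restarts=20):
    """Collect n_snapshots snapshots, re-drawing poor initializations."""
    snapshots, restarts = [], 0
    model = make_model()  # initial random initialization
    while len(snapshots) < n_snapshots and restarts < max_restarts:
        ok, model = train_cycle(model, loader, epochs_per_cycle,
                                check_epoch, loss_threshold)
        if ok:
            snapshots.append(copy.deepcopy(model.state_dict()))
        else:
            model = make_model()  # discard poor initialization, re-draw
            restarts += 1
    return snapshots
```

At test time the retained snapshots would be combined as in standard Snapshot Ensembling, e.g. by averaging their softmax outputs over the ensemble.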