Pruning Convolutional Neural Network (SqueezeNet) Using Taylor Expansion-Based Criterion

Recent research in deep learning focuses on reducing the size of Convolutional Neural Network (CNN) models through compression techniques such as pruning, quantization, and encoding (e.g., Huffman encoding). This paper proposes a method for pruning a CNN based on a Taylor expansion of the change in the model's cost function, ΔC. The proposed algorithm performs greedy criterion-based pruning with fine-tuning by backpropagation on the SqueezeNet architecture. Transfer learning is used to train SqueezeNet on the CIFAR-10 dataset. The proposed algorithm achieves a 70% reduction in model size on SqueezeNet with only a 1% drop in accuracy.
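To make the criterion concrete, the sketch below illustrates one common first-order Taylor formulation of |ΔC|, in which the saliency of each feature map is approximated by |activation × gradient| averaged over the batch and spatial positions. This is a minimal PyTorch illustration under the assumption that the paper's criterion follows this standard form; the function and variable names are hypothetical, not taken from the paper's implementation.

import torch
import torch.nn as nn

def taylor_criterion(activation: torch.Tensor, gradient: torch.Tensor) -> torch.Tensor:
    """Score the feature maps of one conv layer by the first-order Taylor term.

    activation, gradient: tensors of shape (batch, channels, H, W), captured
    with forward/backward hooks during fine-tuning. Returns one score per
    channel; the lowest-scoring channels are candidates for greedy pruning.
    """
    # |ΔC| ≈ |∂C/∂a · a| per element, averaged over the batch and spatial dims.
    scores = (activation * gradient).abs().mean(dim=(0, 2, 3))
    # L2-normalize across channels so scores are comparable between layers.
    return scores / (scores.norm() + 1e-8)

# Example: score the 64 feature maps of a single conv layer on a dummy batch.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
x = torch.randn(8, 3, 32, 32)            # CIFAR-10-sized input
a = conv(x)
a.retain_grad()                           # keep the gradient of this non-leaf tensor
loss = a.pow(2).mean()                    # stand-in for the network's cost function C
loss.backward()
print(taylor_criterion(a, a.grad))        # smallest entries -> prune first

In a full greedy pruning loop, these per-channel scores would be recomputed across all convolutional layers after each pruning-and-fine-tuning step, and the globally lowest-ranked feature maps removed.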