Deep Learning Training with Simulated Approximate Multipliers