[1] Luke Melas-Kyriazi, et al. Do You Even Need Attention? A Stack of Feed-Forward Layers Does Surprisingly Well on ImageNet, 2021, arXiv.
[2] Alexander Kolesnikov, et al. MLP-Mixer: An all-MLP Architecture for Vision, 2021, NeurIPS.
[3] Dmitry Krotov, et al. Large Associative Memory Problem in Neurobiology and Machine Learning, 2020, arXiv.
[4] John J. Hopfield, et al. Dense Associative Memory for Pattern Recognition, 2016, NIPS.
[5] Geir Kjetil Sandve, et al. Hopfield Networks is All You Need, 2020, arXiv.