A Statistician Teaches Deep Learning
