M. A. Ganaie | Minghui Hu | M. Tanveer | Ponnuthurai N. Suganthan
[1] M. A. Ganaie,et al. Random vector functional link network: recent developments, applications, and future directions , 2022, Appl. Soft Comput..
[2] M. A. Ganaie,et al. Ensemble deep random vector functional link network using privileged information for Alzheimer's disease diagnosis , 2022, IEEE/ACM Transactions on Computational Biology and Bioinformatics.
[3] P. Suganthan,et al. Representation learning using deep random vector functional link networks for clustering , 2022, Pattern Recognit..
[4] B. Nallamothu,et al. Vessel segmentation for X-ray coronary angiography using ensemble methods with deep learning and filter-based features , 2022, BMC Medical Imaging.
[5] A. Taguchi,et al. Identification of osteoporosis using ensemble deep learning model with panoramic radiographs and clinical covariates , 2021, Scientific Reports.
[6] M. A. Ganaie,et al. Classification of Alzheimer’s Disease Using Ensemble of Deep Neural Networks Trained Through Transfer Learning , 2021, IEEE Journal of Biomedical and Health Informatics.
[7] S. Satapathy,et al. Improved heart disease detection from ECG signal using deep learning based ensemble model , 2022, Sustain. Comput. Informatics Syst..
[8] Sumit Saroha,et al. An ensemble method to forecast 24-h ahead solar irradiance using wavelet decomposition and BiLSTM deep learning network , 2021, Earth Science Informatics.
[9] Hari Mohan Rai,et al. Hybrid CNN-LSTM deep learning model and ensemble technique for automatic detection of myocardial infarction using big ECG data , 2021, Applied Intelligence.
[10] Meikang Qiu,et al. Intelligent Fault Diagnosis by Fusing Domain Adversarial Training and Maximum Mean Discrepancy via Ensemble Learning , 2021, IEEE Transactions on Industrial Informatics.
[11] Erik Elmroth,et al. DeL-IoT: A deep ensemble learning approach to uncover anomalies in IoT , 2021, Internet Things.
[12] Shamik Sengupta,et al. Deep Ensemble Learning-based Approach to Real-time Power System State Estimation , 2021, International Journal of Electrical Power & Energy Systems.
[13] Zhuo Wang,et al. Deep ensemble neural-like P systems for segmentation of central serous chorioretinopathy lesion , 2021, Inf. Fusion.
[14] Diego Reforgiato Recupero,et al. A multi-layer and multi-ensemble stock trader using deep learning and deep reinforcement learning , 2020, Applied Intelligence.
[15] Amit Kumar Das,et al. Automatic COVID-19 detection from X-ray images using ensemble learning with convolutional neural network , 2020, Pattern Analysis and Applications.
[16] Yun Yang,et al. Multi-label classification with weighted classifier selection and stacked ensemble , 2020, Inf. Sci..
[17] Ming-Ming Cheng,et al. Nonlinear Regression via Deep Negative Correlation Learning , 2019, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[18] P. N. Suganthan,et al. Random Vector Functional Link Neural Network based Ensemble Deep Learning , 2019, Pattern Recognit..
[19] Sam Kwong,et al. Active k-labelsets ensemble for multi-label classification , 2021, Pattern Recognit..
[20] Xinyu Li,et al. A new ensemble convolutional neural network with diversity regularization for fault diagnosis , 2020, Journal of Manufacturing Systems.
[21] Zhiwei Xiong,et al. Real-World Image Denoising with Deep Boosting , 2020, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[22] Francisco Herrera,et al. A practical tutorial on bagging and boosting based ensembles for machine learning: Algorithms, software tools, performance study, practical perspectives and opportunities , 2020, Inf. Fusion.
[23] Ahmad Shalbaf,et al. Automated detection of COVID-19 using ensemble of transfer learning with deep convolutional neural network based on CT scans , 2020, International Journal of Computer Assisted Radiology and Surgery.
[24] Shi Qiu,et al. The ensemble deep learning model for novel COVID-19 on CT images , 2020, Applied Soft Computing.
[25] Sebastian Buschjäger,et al. Generalized Negative Correlation Learning for Deep Ensembling , 2020, ArXiv.
[26] Xiao-Yang Liu,et al. Deep reinforcement learning for automated stock trading: an ensemble strategy , 2020, ICAIF.
[27] Ping Chen,et al. A Novel Deep Learning Model by Stacking Conditional Restricted Boltzmann Machine and Deep Neural Network , 2020, KDD.
[28] Thomas A. Geddes,et al. Ensemble deep learning in bioinformatics , 2020, Nature Machine Intelligence.
[29] S. M. Riazul Islam,et al. A smart healthcare monitoring system for heart disease prediction based on ensemble deep learning and feature fusion , 2020, Inf. Fusion.
[30] Wei Zhang,et al. Grasp for Stacking via Deep Reinforcement Learning , 2020, 2020 IEEE International Conference on Robotics and Automation (ICRA).
[31] M. M. Ruiz,et al. MNIST-NET10: A heterogeneous deep networks fusion based on the degree of certainty to reach 0.1% error rate. Ensembles overview and proposal , 2020, Inf. Fusion.
[32] Mengjie Zhang,et al. Particle Swarm optimisation for Evolving Deep Neural Networks for Image Classification by Evolving and Stacking Transferable Blocks , 2019, 2020 IEEE Congress on Evolutionary Computation (CEC).
[33] Andrew Beng Jin Teoh,et al. Stacking-Based Deep Neural Network: Deep Analytic Network for Pattern Classification , 2018, IEEE Transactions on Cybernetics.
[34] Zhiwen Yu,et al. Semi-Supervised Deep Coupled Ensemble Learning With Classification Landmark Exploration , 2020, IEEE Transactions on Image Processing.
[35] B. Cui,et al. Snapshot boosting: a fast ensemble framework for deep neural networks , 2019, Science China Information Sciences.
[36] David Camacho,et al. Android malware detection through hybrid features fusion and ensemble classifiers: The AndroPyTool framework and the OmniDroid dataset , 2019, Inf. Fusion.
[37] Daisuke Kihara,et al. EnAET: Self-Trained Ensemble AutoEncoding Transformations for Semi-Supervised Learning , 2019, ArXiv.
[38] Ali Aghagolzadeh,et al. Ensemble of CNN for multi-focus image fusion , 2019, Inf. Fusion.
[39] Bo Liu,et al. Unsupervised Ensemble Strategy for Retinal Vessel Segmentation , 2019, MICCAI.
[40] P. Suganthan,et al. Stacked Autoencoder Based Deep Random Vector Functional Link Neural Network for Classification , 2019, Appl. Soft Comput..
[41] Krzysztof J. Cios,et al. An evolutionary approach to build ensembles of multi-label classifiers , 2019, Inf. Fusion.
[42] Biswajeet Pradhan,et al. Novel Hybrid Integration Approach of Bagging-Based Fisher’s Linear Discriminant Function for Groundwater Potential Analysis , 2019, Natural Resources Research.
[43] Chapman Siu,et al. Residual Networks Behave Like Boosting Algorithms , 2019, 2019 IEEE International Conference on Data Science and Advanced Analytics (DSAA).
[44] Xin Li,et al. A novel deep stacking least squares support vector machine for rolling bearing fault diagnosis , 2019, Comput. Ind..
[45] Jihie Kim,et al. Ensemble-Based Deep Reinforcement Learning for Chatbots , 2019, Neurocomputing.
[46] Hong Wen,et al. Adaboost-based security level classification of mobile intelligent terminals , 2019, The Journal of Supercomputing.
[47] Kup-Sze Choi,et al. Deep Additive Least Squares Support Vector Machines for Classification With Model Transfer , 2019, IEEE Transactions on Systems, Man, and Cybernetics: Systems.
[48] Euijoon Ahn,et al. Unsupervised Feature Learning with K-means and An Ensemble of Deep Convolutional Neural Networks for Medical Image Classification , 2019, ArXiv.
[49] Mohammed Meknassi,et al. Enhancing unsupervised neural networks based text summarization with word embedding and ensemble learning , 2019, Expert Syst. Appl..
[50] Hongdong Li,et al. Deep Stacked Hierarchical Multi-Patch Network for Image Deblurring , 2019, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
[51] Kai Feng,et al. SVM-based Deep Stacking Networks , 2019, AAAI.
[52] Horst Possegger,et al. HiBsteR: Hierarchical Boosted Deep Metric Learning for Image Retrieval , 2019, 2019 IEEE Winter Conference on Applications of Computer Vision (WACV).
[53] Cheng Ju,et al. Propensity score prediction for electronic healthcare databases using super learner and high-dimensional propensity score methods , 2017, Journal of applied statistics.
[54] Ji Feng,et al. Deep forest , 2017, IJCAI.
[55] Reem Bahgat,et al. UESTS: An Unsupervised Ensemble Semantic Textual Similarity Method , 2019, IEEE Access.
[56] Selim Akyokus,et al. Deep Learning- and Word Embedding-Based Heterogeneous Classifier Ensembles for Text Classification , 2018, Complex..
[57] Rinkle Rani,et al. BE-DTI': Ensemble framework for drug target interaction prediction using dimensionality reduction and active learning , 2018, Comput. Methods Programs Biomed..
[58] Zhiwei Xiong,et al. Deep Boosting for Image Denoising , 2018, ECCV.
[59] Zhen Cao,et al. The lncLocator: a subcellular localization predictor for long non‐coding RNAs based on a stacked ensemble classifier , 2018, Bioinform..
[60] Lior Rokach,et al. Ensemble learning: A survey , 2018, WIREs Data Mining Knowl. Discov..
[61] Andreas Nürnberger,et al. The Power of Ensembles for Active Learning in Image Classification , 2018, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[62] Guoyan Zheng,et al. Crowd Counting with Deep Negative Correlation Learning , 2018, 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition.
[63] A. Peters,et al. A Deep Learning Algorithm for Prediction of Age-Related Eye Disease Study Severity Scale for Age-Related Macular Degeneration from Color Fundus Photography , 2018, Ophthalmology.
[64] Lei Cao,et al. Ensemble Network Architecture for Deep Reinforcement Learning , 2018 .
[65] Ponnuthurai N. Suganthan,et al. Ensemble incremental learning Random Vector Functional Link network for short-term electric load forecasting , 2018, Knowl. Based Syst..
[66] Zhibin Zhao,et al. Sparse Deep Stacking Network for Fault Diagnosis of Motor , 2018, IEEE Transactions on Industrial Informatics.
[67] Shanlin Yang,et al. Heterogeneous Ensemble for Default Prediction of Peer-to-Peer Lending in China , 2018, IEEE Access.
[68] Aníbal R. Figueiras-Vidal,et al. On building ensembles of stacked denoising auto-encoding classifiers and their further improvement , 2018, Inf. Fusion.
[69] K. Chau,et al. Novel genetic-based negative correlation learning for estimating soil temperature , 2018 .
[70] Gang Wang,et al. SSEL-ADE: A semi-supervised ensemble learning framework for extracting adverse drug events from social media , 2017, Artif. Intell. Medicine.
[71] John Langford,et al. Learning Deep ResNet Blocks Sequentially using Boosting Theory , 2017, ICML.
[72] Mark J. van der Laan,et al. The relative performance of ensemble methods with deep convolutional neural networks for image classification , 2017, Journal of applied statistics.
[73] Chang-Dong Wang,et al. Locally Weighted Ensemble Clustering , 2016, IEEE Transactions on Cybernetics.
[74] Hang Zhang,et al. Online Active Learning Paired Ensemble for Concept Drift and Class Imbalance , 2018, IEEE Access.
[75] Chee Peng Lim,et al. Credit Card Fraud Detection Using AdaBoost and Majority Voting , 2019, IEEE Access.
[76] Jian Yang,et al. Visual Representation and Classification by Learning Group Sparse Deep Stacking Network , 2018, IEEE Transactions on Image Processing.
[77] Horst Possegger,et al. BIER — Boosting Independent Embeddings Robustly , 2017, 2017 IEEE International Conference on Computer Vision (ICCV).
[78] Bohyung Han,et al. BranchOut: Regularization for Online Ensemble Tracking with Convolutional Neural Networks , 2017, 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[79] Iker Gondra,et al. MRI segmentation fusion for brain tumor detection , 2017, Inf. Fusion.
[80] Zhenchang Xing,et al. Ensemble application of convolutional and recurrent neural networks for multi-label text categorization , 2017, 2017 International Joint Conference on Neural Networks (IJCNN).
[81] Ponnuthurai Nagaratnam Suganthan,et al. Empirical Mode Decomposition based ensemble deep learning for load demand time series forecasting , 2017, Appl. Soft Comput..
[82] Feng Duan,et al. Recognizing the Gradual Changes in sEMG Characteristics Based on Incremental Learning of Wavelet Neural Network Ensemble , 2017, IEEE Transactions on Industrial Electronics.
[83] Feng Xu,et al. A Flood Forecasting Model Based on Deep Learning Algorithm via Integrating Stacked Autoencoders with BP Neural Network , 2017, 2017 IEEE Third International Conference on Multimedia Big Data (BigMM).
[84] Shaolei Ren,et al. Online Learning for Offloading and Autoscaling in Energy Harvesting Mobile Edge Computing , 2017, IEEE Transactions on Cognitive Communications and Networking.
[85] Tatsuya Kawahara,et al. Semi-supervised ensemble DNN acoustic model training , 2017, 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[86] Zhong Yin,et al. Recognition of emotions using multimodal physiological signals and an ensemble deep learning model , 2017, Comput. Methods Programs Biomed..
[87] Verónica Bolón-Canedo,et al. Ensemble feature selection: Homogeneous and heterogeneous approaches , 2017, Knowl. Based Syst..
[88] Timo Aila,et al. Temporal Ensembling for Semi-Supervised Learning , 2016, ICLR.
[89] Mehryar Mohri,et al. AdaNet: Adaptive Structural Learning of Artificial Neural Networks , 2016, ICML.
[90] Bohyung Han,et al. BranchOut: Regularization for Online Ensemble Tracking with CNNs , 2017 .
[91] Yan Tong,et al. Incremental Boosting Convolutional Neural Network for Facial Action Unit Recognition , 2017, NIPS.
[92] Lu Sun,et al. Fast random k-labELsets for large-scale multi-label classification , 2016, 2016 23rd International Conference on Pattern Recognition (ICPR).
[93] Lior Wolf,et al. Learning to Count with CNN Boosting , 2016, ECCV.
[94] George D. Magoulas,et al. Deep Incremental Boosting , 2016, GCAI.
[95] Kai Keng Ang,et al. ieRSPOP: A novel incremental rough set-based pseudo outer-product with ensemble learning , 2016, Appl. Soft Comput..
[96] Ming Shao,et al. Infinite Ensemble for Image Clustering , 2016, KDD.
[97] Prabir Kumar Biswas,et al. Deep neural ensemble for retinal vessel segmentation in fundus images towards achieving label-free angiography , 2016, 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).
[98] Ying Ju,et al. Human Protein Subcellular Localization with Integrated Source and Multi-label Ensemble Classifier , 2016, Scientific Reports.
[99] Matthias Schmid,et al. A framework for parameter estimation and model selection in kernel deep stacking networks , 2016, Artif. Intell. Medicine.
[100] Serge J. Belongie,et al. Residual Networks Behave Like Ensembles of Relatively Shallow Networks , 2016, NIPS.
[101] David A. Forsyth,et al. Swapout: Learning an ensemble of deep architectures , 2016, NIPS.
[102] Kilian Q. Weinberger,et al. Deep Networks with Stochastic Depth , 2016, ECCV.
[103] Uri Shaham,et al. A Deep Learning Approach to Unsupervised Ensemble Learning , 2016, ICML.
[104] Chang-Dong Wang,et al. Ensemble clustering using factor graph , 2016, Pattern Recognit..
[105] Ponnuthurai N. Suganthan,et al. Ensemble Classification and Regression-Recent Developments, Applications and Future Directions [Review Article] , 2016, IEEE Computational Intelligence Magazine.
[106] Jian Sun,et al. Deep Residual Learning for Image Recognition , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[107] Timothy Doster,et al. Gradual DropIn of Layers to Train Very Deep Neural Networks , 2015, 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[108] Jian Yang,et al. Boosted Convolutional Neural Networks , 2016, BMVC.
[109] Anna Bou Ezzeddine,et al. Incremental Ensemble Learning for Electricity Load Forecasting , 2016 .
[110] Kai-Fu Tang,et al. Inquire and Diagnose : Neural Symptom Checking Ensemble using Deep Reinforcement Learning , 2016 .
[111] Guigang Zhang,et al. Deep Learning , 2016, Int. J. Semantic Comput..
[112] Feng Liu,et al. Predicting drug side effects by multi-label learning and ensemble learning , 2015, BMC Bioinformatics.
[113] Ponnuthurai Nagaratnam Suganthan,et al. Ensemble methods for wind and solar power forecasting—A state-of-the-art review , 2015 .
[114] Hongming Zhou,et al. Stacked Extreme Learning Machines , 2015, IEEE Transactions on Cybernetics.
[115] Junjie Wu,et al. Spectral Ensemble Clustering , 2015, KDD.
[116] Alagan Anpalagan,et al. Improved short-term load forecasting using bagged neural networks , 2015 .
[117] Jürgen Schmidhuber,et al. Training Very Deep Networks , 2015, NIPS.
[118] Haipeng Luo,et al. Online Gradient Boosting , 2015, NIPS.
[119] Geoffrey E. Hinton,et al. Deep Learning , 2015, Nature.
[120] Bin Yang,et al. Convolutional Channel Features , 2015, 2015 IEEE International Conference on Computer Vision (ICCV).
[121] Geoffrey E. Hinton,et al. Distilling the Knowledge in a Neural Network , 2015, ArXiv.
[122] Jerzy Stefanowski,et al. Neighbourhood sampling in bagging for imbalanced data , 2015, Neurocomputing.
[123] Jian Yang,et al. Sparse Deep Stacking Network for Image Classification , 2015, AAAI.
[124] Yann LeCun,et al. The Loss Surfaces of Multilayer Networks , 2014, AISTATS.
[125] Dumitru Erhan,et al. Going deeper with convolutions , 2014, 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[126] Andrew Zisserman,et al. Very Deep Convolutional Networks for Large-Scale Image Recognition , 2014, ICLR.
[127] Michael S. Bernstein,et al. ImageNet Large Scale Visual Recognition Challenge , 2014, International Journal of Computer Vision.
[128] Pourya Shamsolmoali,et al. Application of Credit Card Fraud Detection: Based on Bagging Ensemble Classifier , 2015 .
[129] Mehryar Mohri,et al. Multi-Class Deep Boosting , 2014, NIPS.
[130] Graham J. Williams,et al. Big Data Opportunities and Challenges: Discussions from Data Analytics Perspectives [Discussion Forum] , 2014, IEEE Computational Intelligence Magazine.
[131] Li Deng,et al. Ensemble deep learning for speech recognition , 2014, INTERSPEECH.
[132] George Michailidis,et al. Critical limitations of consensus clustering in class discovery , 2014, Scientific Reports.
[133] Rabab Kreidieh Ward,et al. Recurrent Deep-Stacking Networks for sequence classification , 2014, 2014 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP).
[134] Ping Liu,et al. Facial Expression Recognition via a Boosted Deep Belief Network , 2014, 2014 IEEE Conference on Computer Vision and Pattern Recognition.
[135] Mehryar Mohri,et al. Deep Boosting , 2014, ICML.
[136] Dong Yu,et al. Deep Learning: Methods and Applications , 2014, Found. Trends Signal Process..
[137] Steven Skiena,et al. DeepWalk: online learning of social representations , 2014, KDD.
[138] Le Zhang,et al. Ensemble deep learning for regression and time series forecasting , 2014, 2014 IEEE Symposium on Computational Intelligence in Ensemble Learning (CIEL).
[139] Michelangelo Ceci,et al. Integrating microRNA target predictions for the discovery of gene regulatory networks: a semi-supervised ensemble learning approach , 2014, BMC Bioinformatics.
[140] B S Weir,et al. HIBAG—HLA genotype imputation with attribute bagging , 2013, The Pharmacogenomics Journal.
[141] Jian Ma,et al. Sentiment classification: The contribution of ensemble learning , 2014, Decis. Support Syst..
[142] Nitish Srivastava,et al. Dropout: a simple way to prevent neural networks from overfitting , 2014, J. Mach. Learn. Res..
[143] Alex Graves,et al. Playing Atari with Deep Reinforcement Learning , 2013, ArXiv.
[144] Dong Yu,et al. Tensor Deep Stacking Networks , 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[145] Yann LeCun,et al. Regularization of Neural Networks using DropConnect , 2013, ICML.
[146] Chuang Zhang,et al. Horizontal and Vertical Ensemble with Deep Representation for Classification , 2013, ArXiv.
[147] Po-Sen Huang,et al. Random features for Kernel Deep Convex Network , 2013, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing.
[148] Jianfeng Gao,et al. Deep stacking networks for information retrieval , 2013, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing.
[149] Min Wu,et al. Multi-label ensemble based on variable pairwise constraint projection , 2013, Inf. Sci..
[150] Geoffrey E. Hinton,et al. ImageNet classification with deep convolutional neural networks , 2012, Commun. ACM.
[151] Gökhan Tür,et al. Use of kernel deep convex networks and end-to-end learning for spoken language understanding , 2012, 2012 IEEE Spoken Language Technology Workshop (SLT).
[152] Alípio Mário Jorge,et al. Ensemble approaches for regression: A survey , 2012, CSUR.
[153] Zhiwen Yu,et al. Transductive multi-label ensemble classification for protein function prediction , 2012, KDD.
[154] Jane You,et al. Semi-supervised ensemble classification in subspaces , 2012, Appl. Soft Comput..
[155] Vincent Pisetta,et al. New Insights into Decision Trees Ensembles , 2012 .
[156] Dong Yu,et al. A deep architecture with bilinear modeling of hidden representations: Applications to phonetic recognition , 2012, 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[157] Gökhan Tür,et al. Towards deeper understanding: Deep convex networks for semantic utterance classification , 2012, 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[158] Dong Yu,et al. Scalable stacking and learning for building deep architectures , 2012, 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[159] Jürgen Schmidhuber,et al. Multi-column deep neural networks for image classification , 2012, 2012 IEEE Conference on Computer Vision and Pattern Recognition.
[160] Sandro Vega-Pons,et al. A Survey of Clustering Ensemble Algorithms , 2011, Int. J. Pattern Recognit. Artif. Intell..
[161] Philip S. Yu,et al. Multi-label Ensemble Learning , 2011, ECML/PKDD.
[162] Dong Yu,et al. Deep Convex Net: A Scalable Architecture for Speech Pattern Classification , 2011, INTERSPEECH.
[163] Geoff Holmes,et al. Classifier chains for multi-label classification , 2009, Machine Learning.
[164] Chris H. Q. Ding,et al. Hierarchical Ensemble Clustering , 2010, 2010 IEEE International Conference on Data Mining.
[165] Steffen Udluft,et al. Ensembles of Neural Networks for Robust Reinforcement Learning , 2010, 2010 Ninth International Conference on Machine Learning and Applications.
[166] Albert Y. Zomaya,et al. A Review of Ensemble Methods in Bioinformatics , 2010, Current Bioinformatics.
[167] Qiang-Li Zhao,et al. Incremental Learning by Heterogeneous Bagging Ensemble , 2010, ADMA.
[168] Oleksandr Makeyev,et al. Neural network with ensembles , 2010, The 2010 International Joint Conference on Neural Networks (IJCNN).
[169] Xin Yao,et al. The Impact of Diversity on Online Ensemble Learning in the Presence of Concept Drift , 2010, IEEE Transactions on Knowledge and Data Engineering.
[170] Yoshua Bengio,et al. Understanding the difficulty of training deep feedforward neural networks , 2010, AISTATS.
[171] Xiaojin Zhu,et al. Semi-Supervised Learning , 2010, Encyclopedia of Machine Learning.
[172] Lior Rokach,et al. Ensemble-based classifiers , 2010, Artificial Intelligence Review.
[173] Saso Dzeroski,et al. Predicting gene function using hierarchical multi-label decision tree ensembles , 2010, BMC Bioinformatics.
[174] Xin Yao,et al. Selective negative correlation learning approach to incremental learning , 2009, Neurocomputing.
[175] Lawrence K. Saul,et al. Identifying suspicious URLs: an application of large-scale online learning , 2009, ICML '09.
[176] Zhi-Hua Zhou,et al. When semi-supervised learning meets ensemble learning , 2009, MCS.
[177] Grigorios Tsoumakas,et al. Pruning an ensemble of classifiers via reinforcement learning , 2009, Neurocomputing.
[178] O. Chapelle,et al. Semi-Supervised Learning (Chapelle, O. et al., Eds.; 2006) [Book reviews] , 2009, IEEE Transactions on Neural Networks.
[179] Robi Polikar,et al. Learn$^{++}$ .NC: Combining Ensemble of Classifiers With Dynamically Weighted Consult-and-Vote for Efficient Incremental Learning of New Classes , 2009, IEEE Transactions on Neural Networks.
[180] Hisashi Kashima,et al. Roughly balanced bagging for imbalanced data , 2009, Stat. Anal. Data Min..
[181] Marco Wiering,et al. Ensemble Algorithms in Reinforcement Learning , 2008, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).
[182] Chun-Xia Zhang,et al. RotBoost: A technique for combining Rotation Forest and AdaBoost , 2008, Pattern Recognit. Lett..
[183] Robi Polikar,et al. An ensemble based data fusion approach for early diagnosis of Alzheimer's disease , 2008, Inf. Fusion.
[184] Grigorios Tsoumakas,et al. Random k -Labelsets: An Ensemble Method for Multilabel Classification , 2007, ECML.
[185] Grigorios Tsoumakas,et al. Multi-Label Classification: An Overview , 2007, Int. J. Data Warehous. Min..
[186] Robi Polikar,et al. An Ensemble Approach for Incremental Learning in Nonstationary Environments , 2007, MCS.
[187] Wenjia Wang,et al. On diversity and accuracy of homogeneous and heterogeneous ensembles , 2007, Int. J. Hybrid Intell. Syst..
[188] M.A. Wiering,et al. Two Novel On-policy Reinforcement Learning Algorithms based on TD(λ)-methods , 2007, 2007 IEEE International Symposium on Approximate Dynamic Programming and Reinforcement Learning.
[189] Robi Polikar,et al. An Ensemble-Based Incremental Learning Approach to Data Fusion , 2007, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics).
[190] M. J. van der Laan,et al. Super Learner , 2007, Statistical Applications in Genetics and Molecular Biology.
[191] Xuelong Li,et al. Asymmetric bagging and random subspace for support vector machines-based relevance feedback in image retrieval , 2006, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[192] Wei Tang,et al. Clusterer ensemble , 2006, Knowl. Based Syst..
[193] Peter Tiño,et al. Managing Diversity in Regression Ensembles , 2005, J. Mach. Learn. Res..
[194] Sungzoon Cho,et al. Response models based on bagging neural networks , 2005 .
[195] Jun Gao,et al. A survey of neural network ensembles , 2005, 2005 International Conference on Neural Networks and Brain.
[196] Xin Yao,et al. Diversity creation methods: a survey and categorisation , 2004, Inf. Fusion.
[197] E. M. Kleinberg,et al. Stochastic discrimination , 1990, Annals of Mathematics and Artificial Intelligence.
[198] Stuart J. Russell,et al. Online bagging and boosting , 2005, 2005 IEEE International Conference on Systems, Man and Cybernetics.
[199] Xiaojin Zhu,et al. Semi-Supervised Learning Literature Survey , 2005 .
[200] Richard S. Sutton,et al. Reinforcement Learning: An Introduction , 1998, IEEE Trans. Neural Networks.
[201] Raymond J. Mooney,et al. Diverse ensembles for active learning , 2004, ICML.
[202] Gareth James,et al. Variance and Bias for General Loss Functions , 2003, Machine Learning.
[203] Torsten Hothorn,et al. Bagging survival trees , 2002, Statistics in medicine.
[204] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[205] Leo Breiman,et al. Randomizing Outputs to Increase Prediction Accuracy , 2000, Machine Learning.
[206] Peter Dayan,et al. Q-learning , 1992, Machine Learning.
[207] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[208] Leo Breiman,et al. Stacked regressions , 2004, Machine Learning.
[209] Jerome H. Friedman,et al. On Bias, Variance, 0/1—Loss, and the Curse-of-Dimensionality , 2004, Data Mining and Knowledge Discovery.
[210] Yann LeCun,et al. Large Scale Online Learning , 2003, NIPS.
[211] Raymond J. Mooney,et al. Constructing Diverse Classifier Ensembles using Artificial Training Examples , 2003, IJCAI.
[212] Robert P. W. Duin,et al. Limits on the majority vote accuracy in classifier fusion , 2003, Pattern Analysis & Applications.
[213] Hyun-Chul Kim,et al. Support Vector Machine Ensemble with Bagging , 2002, SVM.
[214] Ayhan Demiriz,et al. Exploiting unlabeled data in ensemble methods , 2002, KDD.
[215] Vasant Honavar,et al. Learn++: an incremental learning algorithm for supervised neural networks , 2001, IEEE Trans. Syst. Man Cybern. Part C.
[216] J. Friedman. Greedy function approximation: A gradient boosting machine. , 2001 .
[217] Min Qi,et al. Pricing and hedging derivative securities with neural networks: Bayesian regularization, early stopping, and bagging , 2001, IEEE Trans. Neural Networks.
[218] P. Bühlmann,et al. Analyzing Bagging , 2001 .
[219] Thomas G. Dietterich. Multiple Classifier Systems , 2000, Lecture Notes in Computer Science.
[220] Andreas Buja,et al. Smoothing Effects of Bagging , 2000 .
[221] Pedro M. Domingos. A Unified Bias-Variance Decomposition and its Applications , 2000, ICML.
[222] Xin Yao,et al. Ensemble learning via negative correlation , 1999, Neural Networks.
[223] Tin Kam Ho,et al. The Random Subspace Method for Constructing Decision Forests , 1998, IEEE Trans. Pattern Anal. Mach. Intell..
[224] L. Breiman. Arcing classifier (with discussion and a rejoinder by the author) , 1998 .
[225] Jianchang Mao,et al. A case study on bagging, boosting and basic ensembles of neural networks for OCR , 1998, 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227).
[226] David H. Wolpert,et al. On Bias Plus Variance , 1997, Neural Computation.
[227] Yoav Freund,et al. Boosting the margin: A new explanation for the effectiveness of voting methods , 1997, ICML.
[228] R. Tibshirani,et al. Combining Estimates in Regression and Classification , 1996 .
[229] Yoav Freund,et al. Experiments with a New Boosting Algorithm , 1996, ICML.
[230] Ron Kohavi,et al. Bias Plus Variance Decomposition for Zero-One Loss Functions , 1996, ICML.
[231] Leo Breiman. Bias, Variance, and Arcing Classifiers , 1996 .
[232] Richard S. Sutton,et al. Generalization in Reinforcement Learning: Successful Examples Using Sparse Coarse Coding , 1995, NIPS.
[233] Thomas G. Dietterich,et al. Error-Correcting Output Coding Corrects Bias and Variance , 1995, ICML.
[234] Dejan J. Sobajic,et al. Learning and generalization characteristics of the random vector Functional-link net , 1994, Neurocomputing.
[235] L. Ryd,et al. On bias , 1994, Acta Orthopaedica Scandinavica.
[236] Mahesan Niranjan,et al. On-line Q-learning using connectionist systems , 1994 .
[237] Anders Krogh,et al. Neural Network Ensembles, Cross Validation, and Active Learning , 1994, NIPS.
[238] David H. Wolpert,et al. Stacked generalization , 1992, Neural Networks.
[239] Elie Bienenstock,et al. Neural Networks and the Bias/Variance Dilemma , 1992, Neural Computation.
[240] Sepp Hochreiter,et al. Untersuchungen zu dynamischen neuronalen Netzen , 1991 .
[241] Lars Kai Hansen,et al. Neural Network Ensembles , 1990, IEEE Trans. Pattern Anal. Mach. Intell..