Improving surgical models through one/two class learning

Only a minority of patients undergoing in-patient surgical procedures experience complications. However, the large number of in-patient surgeries (over 48 million procedures each year in the U.S.) results in substantial overall mortality and morbidity from these complications. This burden can be reduced through improvements in the ability to evaluate patients at the bedside and to assess surgical quality and outcomes across hospitals. Unfortunately, developing clinical models for surgical complications is challenging because the datasets available for model training are generally small and because many important complications have low prevalence, producing class imbalance. In this paper, we address this issue and explore the idea of jointly leveraging the benefits of both supervised and unsupervised learning to model surgical complications that occur infrequently. In particular, we study an approach in which supervised and unsupervised model development are treated as tasks between which knowledge can be transferred. Focusing this work on support vector machine (SVM) classification, we describe a transfer learning algorithm that improves performance relative to both supervised (i.e., binary or 2-class SVM) and unsupervised (i.e., 1-class SVM) methods, as well as to cost-sensitive weighting techniques, for predicting different surgical complications within the American College of Surgeons National Surgical Quality Improvement Program registry.
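
The abstract references two baseline families: an unsupervised 1-class SVM trained only on the majority (non-complication) class, and a supervised 2-class SVM with cost-sensitive class weights to offset imbalance. As a point of reference, the sketch below illustrates those two baselines on synthetic imbalanced data using scikit-learn; the library choice, the synthetic data, and all parameter values are illustrative assumptions, and the paper's transfer learning algorithm itself is not shown here.

```python
# Minimal sketch (assumptions: scikit-learn, synthetic data) of the two
# baselines mentioned in the abstract: an unsupervised 1-class SVM fit on
# non-complication cases only, and a cost-weighted 2-class SVM.
import numpy as np
from sklearn.svm import OneClassSVM, SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_features = 20

# Synthetic stand-in for a small, imbalanced surgical registry:
# roughly 5% of patients experience the complication of interest.
X_neg = rng.normal(0.0, 1.0, size=(950, n_features))   # no complication
X_pos = rng.normal(0.8, 1.0, size=(50, n_features))    # complication
X = np.vstack([X_neg, X_pos])
y = np.concatenate([np.zeros(950), np.ones(50)])

# Unsupervised baseline: 1-class SVM trained on the negative class alone;
# lower decision_function values indicate more "outlier-like" patients,
# so the sign is flipped to score risk of complication.
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_neg)
oc_scores = -ocsvm.decision_function(X)

# Supervised baseline: 2-class SVM with cost-sensitive weighting, scaling
# the penalty on the rare positive class inversely to its prevalence.
svc = SVC(kernel="rbf", class_weight="balanced", gamma="scale").fit(X, y)
svc_scores = svc.decision_function(X)

# In-sample AUCs, shown only to keep the sketch short; a real evaluation
# would use held-out data or cross-validation.
print("1-class SVM AUC:", roc_auc_score(y, oc_scores))
print("Cost-weighted 2-class SVM AUC:", roc_auc_score(y, svc_scores))
```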