Parameter Tuning for Software Defect Prediction Using Differential Evolution and Simulated Annealing

Machine learning algorithms are used in software engineering to predict defects. These defect predictors are powerful in comparison to manual methods, and they are also relatively simple to understand and apply. However, an important step that is often ignored is the tuning of these defect predictors to optimize their performance. We seek simple, easy-to-implement methods for tuning defect predictors and compare the performance of those methods. We ran Differential Evolution and Simulated Annealing as optimizers on several datasets from open-source Java systems to explore the tuning space. Finally, we tested the tunings and compared the results obtained from the two methods. We found that tuning improved performance in the majority of cases. We also found that not all optimization algorithms used for tuning produced the same results. Since (1) there is a significant improvement in performance after parameter tuning, standard practice in software analytics needs to change: it is not sufficient to present results without first performing a proper tuning study, especially in the case of defect prediction. Since (2) Differential Evolution and Simulated Annealing did not give similar results for the majority of the datasets, it is necessary to perform tuning with several different optimization algorithms to obtain the best possible results.
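The abstract does not fix a particular learner or tuning space, so as an illustration only, the sketch below tunes a CART-style defect predictor with both kinds of optimizer. The choice of learner (DecisionTreeClassifier), the three tuned parameters, the synthetic placeholder data, and the use of scipy's differential_evolution and dual_annealing (a generalized simulated annealing) as stand-ins for the two optimizers are all assumptions, not the paper's exact setup.

```python
# A minimal sketch, assuming scipy and scikit-learn, of tuning a defect
# predictor with Differential Evolution and (generalized) Simulated
# Annealing. X and y are synthetic placeholders; in a real study they
# would be the metrics and defect labels of the Java datasets.
import numpy as np
from scipy.optimize import differential_evolution, dual_annealing
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))        # placeholder static-code metrics
y = rng.integers(0, 2, size=200)      # placeholder defect labels

# Assumed tuning space: (max_depth, min_samples_split, min_samples_leaf)
BOUNDS = [(1, 20), (2, 20), (1, 12)]

def loss(params):
    """Negative cross-validated F1 of a CART learner under a candidate tuning."""
    depth, split, leaf = (int(round(p)) for p in params)
    model = DecisionTreeClassifier(
        max_depth=depth, min_samples_split=split,
        min_samples_leaf=leaf, random_state=0)
    return -cross_val_score(model, X, y, cv=3, scoring="f1").mean()

# Run both optimizers over the same tuning space and compare the results.
de = differential_evolution(loss, BOUNDS, maxiter=20, seed=0)
sa = dual_annealing(loss, BOUNDS, maxiter=50, seed=0)
print("DE best F1:", -de.fun, "with tuning", de.x.round())
print("SA best F1:", -sa.fun, "with tuning", sa.x.round())
```

Comparing the two printed tunings and scores mirrors the comparison the abstract describes: if the two optimizers disagree, that is evidence for running more than one of them rather than trusting a single tuner.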