A Comparative Evaluation of the “m-ACO” Technique for Test Suite Prioritization

Objectives: To comparatively evaluate the novel test case prioritization technique “m-ACO” (“Modified Ant Colony Optimization”) for regression testing. Methods: “m-ACO” prioritizes test cases by altering the food source selection criterion of natural ants to enhance fault diversity. The proposed “m-ACO” technique has been implemented in Perl. This paper compares the proposed “m-ACO” technique for test case prioritization with the GA (“Genetic Algorithm”), BCO (“Bee Colony Optimization”) and ACO (“Ant Colony Optimization”) algorithms using three case studies. Two metrics, APFD (“Average Percentage of Faults Detected”) and PTR (“Percentage of Test Suite Required for Complete Fault Coverage”), have been used to measure the effectiveness of the proposed “m-ACO” technique. Findings: The proposed “m-ACO” technique produced optimal or near-optimal solutions and demonstrated its efficiency in comparison with the GA, BCO and ACO methods individually. Improvements: The proposed technique improves on the ACO method by altering the food source selection criterion of natural ants. Future work in this direction will comparatively evaluate the proposed “m-ACO” technique on well-known software testing problems and open source software. An automated tool for the proposed technique is being developed.
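For context, APFD is conventionally defined as APFD = 1 − (TF1 + TF2 + … + TFm)/(n·m) + 1/(2n), where n is the number of test cases, m is the number of faults, and TFi is the position in the prioritized suite of the first test case that reveals fault i. The sketch below is a minimal illustration of this computation, not the authors’ Perl implementation; the fault matrix, test names, fault labels and the chosen ordering are assumptions for demonstration only.

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical fault-detection matrix: $detects{$test}{$fault} = 1
# if the test reveals the fault (illustrative data, not from the paper).
my %detects = (
    T1 => { F1 => 1, F3 => 1 },
    T2 => { F2 => 1 },
    T3 => { F1 => 1, F2 => 1, F3 => 1 },
);
my @order  = qw(T3 T2 T1);    # an assumed prioritized ordering of the suite
my @faults = qw(F1 F2 F3);

my $n = scalar @order;        # number of test cases
my $m = scalar @faults;       # number of faults

# Sum TF_i: the 1-based position of the first test that detects each fault.
my $sum_tf = 0;
for my $fault (@faults) {
    for my $pos (0 .. $#order) {
        if ($detects{ $order[$pos] }{$fault}) {
            $sum_tf += $pos + 1;
            last;
        }
    }
}

# APFD = 1 - (TF1 + ... + TFm)/(n*m) + 1/(2n)
my $apfd = 1 - $sum_tf / ($n * $m) + 1 / (2 * $n);
printf "APFD = %.3f\n", $apfd;

For this toy ordering, the first test detects all three faults, so each TFi = 1 and APFD ≈ 0.833; orderings that expose faults later yield lower APFD values, which is what the metric is meant to penalize.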