Effective and easy access to stored information is a fundamental need of computer users. Advances in storage technology and the ubiquity of digital devices have enabled users to generate vast amounts of information, which in turn raises the demand for efficient schemes to search and retrieve it. Desktop search engines (DSEs) are developed to help users find information on their desktops quickly and easily. However, the availability of several competing DSEs makes it difficult for users to select the one that renders their overall experience of using personal computers least frustrating. This paper introduces a novel systematic approach to the performance evaluation of DSEs that leverages scientific methods from information retrieval. The proposed methodology is based on the Cranfield approach and consists of formulating several null hypotheses (H_0), developing static collections of documents and queries, and applying standard measures of relevance. Statistical analysis is performed to test the significance of the hypotheses and measurements. Copernic and Google are selected as case tools for executing the methodology and confirming its validity. The methodology is generic and can be applied to the performance evaluation of any DSE. Results obtained from the test exercises indicate that the methodology provides detailed, accurate, and useful information for the performance evaluation of any DSE.