A Model of User-Oriented Reduct Construction for Machine Learning

An implicit assumption of many machine learning algorithms is that all attributes are equally important. Such an algorithm typically selects attributes based solely on their statistical characteristics, without considering their semantic interpretations. To resolve the difficulties associated with this unrealistic assumption, many researchers have attempted to introduce user judgements of attribute importance into machine learning, but a formal framework is still lacking. Based on decision theory and measurement theory, a model of user-oriented reduct construction that takes user preferences over attributes into account is proposed for machine learning. The model seamlessly combines internal information (the statistical characteristics of the data) and external information (the user's judgement of attribute importance). User preferences over individual attributes are extended to preferences over attribute sets, from which user-preferred reducts can be constructed.
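
To make the idea concrete, the sketch below shows one way a user preference order over attributes can guide reduct construction. It is illustrative only: the function names, the greedy addition-then-deletion scheme, and the use of the rough-set positive-region dependency as the internal measure are assumptions for this sketch, not the exact construction proposed in the paper. Attributes are added in the user's preferred order until the decision-preserving dependency of the full attribute set is reached, and then redundant, less preferred attributes are removed.

```python
# Illustrative sketch: reduct construction guided by a user preference order.
# Internal information: rough-set dependency (size of the decision-consistent
# positive region); external information: the user's ranking of attributes.

def partition(rows, attrs):
    """Group row indices into equivalence classes by their values on attrs."""
    blocks = {}
    for i, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return blocks.values()

def dependency(rows, attrs, decision):
    """Fraction of rows whose equivalence class is consistent on the decision."""
    pos = sum(len(block) for block in partition(rows, attrs)
              if len({rows[i][decision] for i in block}) == 1)
    return pos / len(rows)

def preferred_reduct(rows, attrs, decision, preference):
    """Greedy addition in user-preference order, then redundancy removal.
    `preference` maps each attribute to a rank (smaller = more preferred)."""
    target = dependency(rows, list(attrs), decision)
    reduct = []
    # Addition phase: add attributes in the user's preferred order until the
    # dependency of the full attribute set is reached.
    for a in sorted(attrs, key=lambda x: preference[x]):
        reduct.append(a)
        if dependency(rows, reduct, decision) >= target:
            break
    # Deletion phase: drop redundant attributes, least preferred first.
    for a in sorted(reduct, key=lambda x: preference[x], reverse=True):
        rest = [b for b in reduct if b != a]
        if rest and dependency(rows, rest, decision) >= target:
            reduct = rest
    return reduct

# Tiny illustrative decision table with condition attributes a1, a2, a3
# and decision attribute d; the user prefers a2 over a1 over a3.
rows = [
    {"a1": 0, "a2": 0, "a3": 1, "d": "no"},
    {"a1": 0, "a2": 1, "a3": 1, "d": "yes"},
    {"a1": 1, "a2": 0, "a3": 0, "d": "yes"},
    {"a1": 1, "a2": 1, "a3": 0, "d": "yes"},
]
print(preferred_reduct(rows, ["a1", "a2", "a3"], "d", {"a2": 0, "a1": 1, "a3": 2}))
# -> ['a2', 'a1']
```

Under this kind of scheme, the internal measure decides whether an attribute set preserves the classification ability of the full set, while the external preference order decides which of the many possible reducts is returned.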
