The concept of jumping emerging patterns (JEPs) has been proposed to describe discriminating features that occur in the positive training instances but never occur in the negative class; JEPs have been used to construct classifiers that generally achieve better accuracy than state-of-the-art classifiers such as C4.5. In this paper, we present algorithms for maintaining the space of jumping emerging patterns (the JEP space). We prove that JEP spaces satisfy the property of convexity, and therefore can be concisely represented by two bounds, consisting respectively of the most general elements and the most specific elements. In response to the insertion of new training instances, a JEP space is modified by operating on its boundary elements and the boundary elements of the JEP spaces associated with the new instances. This strategy avoids rebuilding the new JEP space from scratch. In addition, our maintenance algorithms also handle other cases, including the deletion of instances, the insertion of new attributes, and the deletion of attributes.
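To make the boundary representation concrete, the following is a minimal, illustrative sketch (not the paper's incremental maintenance algorithm) in Python. It enumerates JEPs naively from toy data and keeps only the two bounds of the convex space: the most general (minimal) and most specific (maximal) JEPs. The function name `jep_space` and the toy transactions are assumptions made for illustration only.

```python
from itertools import combinations

def jep_space(positives, negatives):
    """Return (left_bound, right_bound) of the JEP space.

    positives, negatives: lists of frozensets of items.
    A JEP is an itemset contained in at least one positive instance
    but in no negative instance.
    """
    # Candidate patterns: all non-empty subsets of the positive instances.
    candidates = set()
    for inst in positives:
        items = sorted(inst)
        for r in range(1, len(items) + 1):
            for combo in combinations(items, r):
                candidates.add(frozenset(combo))

    # Keep only patterns that occur in no negative instance.
    jeps = {p for p in candidates
            if not any(p <= neg for neg in negatives)}

    # By convexity, the space is fully described by its two bounds:
    # left bound = minimal (most general) JEPs,
    # right bound = maximal (most specific) JEPs.
    left = {p for p in jeps if not any(q < p for q in jeps)}
    right = {p for p in jeps if not any(p < q for q in jeps)}
    return left, right

# Toy example: item 'c' never appears in the negative class,
# so {'c'} is a most general JEP.
pos = [frozenset({'a', 'b', 'c'}), frozenset({'a', 'c'})]
neg = [frozenset({'a', 'b'}), frozenset({'b'})]
left, right = jep_space(pos, neg)
print(sorted(map(sorted, left)))   # [['c']]
print(sorted(map(sorted, right)))  # [['a', 'b', 'c']]
```

Any JEP lies between an element of the left bound and an element of the right bound, which is why the paper's maintenance operations need only manipulate these boundary elements rather than the whole space.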