DOI: 10.5555/1082222.1082228

Using association rules to make rule-based classifiers robust

Authors Info & Claims
Published:30 January 2005Publication History

ABSTRACT

Rule-based classification systems have been widely used in real-world applications because of the easy interpretability of rules. Many traditional rule-based classifiers prefer small rule sets to large ones, but small classifiers are sensitive to missing values in unseen test data. In this paper, we present a larger classifier that is less sensitive to such missing values. We show experimentally that it is more accurate than some benchmark classifiers when unseen test data contain missing values.
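The intuition can be illustrated with a short sketch. The following Python example is hypothetical and is not the authors' method: it only shows how a confidence-ordered rule set behaves when rules are skipped because an attribute they test is missing, so that a larger rule set leaves more fallback rules able to cover a record. All attribute names, class labels, and confidence values below are invented for illustration.

MISSING = None  # placeholder for a missing attribute value


class Rule:
    """A single if-then classification rule (illustrative only)."""

    def __init__(self, conditions, label, confidence):
        self.conditions = conditions  # e.g. {"outlook": "sunny"}
        self.label = label            # class predicted when the rule fires
        self.confidence = confidence  # used to order rules

    def applicable(self, record):
        # A rule can only fire if every attribute it tests is present
        # in the record and has the required value.
        return all(record.get(attr) is not MISSING and record.get(attr) == value
                   for attr, value in self.conditions.items())


class RuleSetClassifier:
    """Applies the highest-confidence rule that covers a record."""

    def __init__(self, rules, default_label):
        self.rules = sorted(rules, key=lambda r: r.confidence, reverse=True)
        self.default_label = default_label

    def predict(self, record):
        for rule in self.rules:
            if rule.applicable(record):
                return rule.label
        return self.default_label  # no rule covers the record


# Toy usage: with only the first rule (a "small" rule set), a record missing
# "outlook" would fall back to the default class; the extra rule on
# "humidity" keeps the prediction informed.
rules = [
    Rule({"outlook": "sunny"}, "play", 0.90),
    Rule({"humidity": "high"}, "no-play", 0.75),
]
clf = RuleSetClassifier(rules, default_label="play")
print(clf.predict({"outlook": MISSING, "humidity": "high"}))  # -> no-play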

Published in

ADC '05: Proceedings of the 16th Australasian Database Conference - Volume 39
January 2005
180 pages
ISBN: 192068221X

Publisher

Australian Computer Society, Inc., Australia

Qualifiers

Article

Acceptance Rates

Overall acceptance rate: 98 of 224 submissions, 44%
