An anytime approach to connectionist theory refinement: refining the topologies of knowledge-based neural networks
Publisher:
  • University of Wisconsin at Madison
  • Engineering Experiment Station Madison, WI
  • United States
Order Number: UMI Order No. GAX95-37146
Abstract

Many scientific and industrial problems can be better understood by learning from samples of the task at hand. For this reason, the machine learning and statistics communities devote considerable research effort to inductive-learning algorithms that try to learn the true "concept" of a task from a set of its examples. Oftentimes, however, one has additional resources readily available, but largely unused, that can improve the concept these learning algorithms generate. These resources include available computer cycles, as well as prior knowledge describing what is currently known about the domain. Effective utilization of available computer time is important since, for most domains, an expert is willing to wait weeks, or even months, if a learning system can produce an improved concept. Using prior knowledge is important since it can contain information not present in the current set of training examples.

In this thesis, I present three "anytime" approaches to connectionist theory refinement. Briefly, these approaches start by translating a set of rules describing what is currently known about the domain into a neural network, thus generating a knowledge-based neural network (KNN). My approaches then utilize available computer time to improve this KNN by continually refining its weights and topology. My first method, TopGen, searches for good "local" refinements to the KNN topology. It does this by adding nodes to the KNN in a manner analogous to symbolically adding rules and conjuncts to an incorrect rule base. My next approach, REGENT, uses genetic algorithms to find better "global" changes to this topology. REGENT proceeds by using (a) the domain-specific rules to help create the initial population of KNNs and (b) crossover and mutation operators specifically designed for KNNs. My final algorithm, ADDEMUP, searches for an "ensemble" of KNNs that work together to produce an effective composite prediction.
ADDEMUP works by using genetic algorithms to continually create new networks, keeping the set of networks that are as accurate as possible while disagreeing with each other as much as possible. Empirical results show that these algorithms successfully achieve each of their respective goals.
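The selection criterion described above, keeping members that are individually accurate yet mutually disagreeing, can be sketched as a simple scoring rule. This is a minimal illustration only, not the thesis's actual fitness function: the function names, the averaged composite prediction, the squared-disagreement measure, and the `lam` trade-off weight are all assumptions made for the example.

```python
def ensemble_fitness(predictions, targets, lam=1.0):
    """Score each candidate network by accuracy plus lam-weighted
    disagreement with the ensemble's composite (averaged) prediction.

    predictions: one list of output activations in [0, 1] per member
    targets: 0/1 class labels, one per example
    """
    n = len(targets)
    # Composite prediction: average the member outputs per example.
    ensemble = [sum(p[j] for p in predictions) / len(predictions)
                for j in range(n)]
    scores = []
    for p in predictions:
        # Fraction of examples this member classifies correctly.
        accuracy = sum((o > 0.5) == bool(t)
                       for o, t in zip(p, targets)) / n
        # Mean squared disagreement with the composite prediction.
        diversity = sum((o - e) ** 2 for o, e in zip(p, ensemble)) / n
        scores.append(accuracy + lam * diversity)
    return scores

def select_ensemble(predictions, targets, k, lam=1.0):
    """Keep the k members with the best combined score."""
    scores = ensemble_fitness(predictions, targets, lam)
    ranked = sorted(range(len(scores)),
                    key=lambda i: scores[i], reverse=True)
    return ranked[:k]
```

Under this scoring, a member that is both accurate and far from the ensemble average outranks one that is accurate but redundant, which captures the accuracy-versus-disagreement trade-off the abstract describes.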

Contributors
  • University of Montana
