Dissertation/Thesis Abstract

An analysis of a model-based evolutionary algorithm: Learnable Evolution Model
by Coletti, Mark, Ph.D., George Mason University, 2014, 247; 3625081
Abstract (Summary)

An evolutionary algorithm (EA) is a biologically inspired metaheuristic that uses mutation, crossover, reproduction, and selection operators to evolve solutions to a given problem. The Learnable Evolution Model (LEM) is an EA in which the evolutionary component works in tandem with a machine learner to collaboratively create populations of individuals. The machine learner infers rules from the best and least fit individuals, and this knowledge is then exploited to improve the quality of offspring.
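To make the mechanism concrete, here is a minimal sketch of one LEM-style generation step. The per-dimension [min, max] "rules" and the function and parameter names (`lem_generation`, `elite_frac`) are illustrative simplifications, not the dissertation's operators; a real LEM uses a proper rule learner such as AQ that contrasts the best and worst groups.

```python
import random

def lem_generation(population, fitness, n_offspring, elite_frac=0.3):
    """One LEM-style generation step: learn value intervals from the
    fittest individuals, then sample offspring inside those intervals.
    The interval 'rules' here are a toy stand-in for a real rule
    learner such as AQ."""
    ranked = sorted(population, key=fitness, reverse=True)
    n_best = max(2, int(len(ranked) * elite_frac))
    best = ranked[:n_best]

    # "Learn" a rule per dimension: the [min, max] range occupied by
    # the best individuals.  Real LEM rules are richer (conjunctions
    # of attribute conditions contrasting best vs. worst groups).
    dims = len(best[0])
    intervals = [(min(ind[d] for ind in best),
                  max(ind[d] for ind in best)) for d in range(dims)]

    # Exploit the rules: create offspring uniformly within each interval.
    return [[random.uniform(lo, hi) for lo, hi in intervals]
            for _ in range(n_offspring)]

# Toy usage: maximize -sum(x^2) over 2-D real vectors.
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(20)]
kids = lem_generation(pop, lambda ind: -sum(x * x for x in ind), 10)
```

Because offspring are confined to regions the learner deems fit, this step alone already exerts a selection-like pressure, which is the emergent effect the abstract discusses below.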

Unfortunately, most of the extant work on LEM has been ad hoc, so there does not yet exist a deep understanding of how LEM works. This lack of understanding, in turn, means that there is no set of best practices for implementing LEM. For example, most LEM implementations use rules that describe value ranges corresponding to areas of higher fitness in which offspring should be created, yet we do not know the efficacy of different approaches for sampling those intervals. Nor do we have sufficient guidance for assembling the training sets of positive and negative examples from which the ML component learns.
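As an illustration of why interval sampling is a genuine design choice, below are three hypothetical ways to draw a value from a learned rule interval [lo, hi]. The names and distributions are assumptions for exposition, not the specific approaches the dissertation evaluates; the point is that each strategy spreads offspring through the interval differently and so can shape search behavior differently.

```python
import random

def sample_uniform(lo, hi):
    """Draw uniformly anywhere inside the interval."""
    return random.uniform(lo, hi)

def sample_center(lo, hi, spread=0.2):
    """Concentrate offspring near the interval's midpoint
    (Gaussian around the center, clamped to the interval)."""
    mid, half = (lo + hi) / 2.0, (hi - lo) / 2.0
    return min(hi, max(lo, random.gauss(mid, spread * half)))

def sample_edges(lo, hi):
    """Bias offspring toward the interval's boundaries."""
    u = random.random() ** 2  # squaring skews draws toward 0
    if random.random() < 0.5:
        return lo + u * (hi - lo)   # near the lower edge
    return hi - u * (hi - lo)       # near the upper edge
```

A center-biased sampler behaves more exploitatively, while an edge-biased one probes the rule's boundary, where the learner's description of the fit region is least certain.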

This research addresses those open issues by exploring three rule-interval sampling approaches and three training set configurations on a number of test problems representative of those that practitioners may encounter. Using the machine learner to create offspring induces a unique emergent selection pressure, separate from the selection pressure exerted by parent and survivor selection. One outcome of this research is a partial ordering of the impact that these rule-interval sampling approaches and training set configurations have on that selection pressure, which practitioners can use for implementation guidance: a practitioner can modulate selection pressure by traversing the set of design configurations within a Hasse graph defined by this partial order.


Indexing (document details)
Advisor: De Jong, Kenneth
Committee: Arciszewski, Tomasz, Domeniconi, Carlotta, Luke, Sean
School: George Mason University
Department: Computer Science
School Location: United States -- Virginia
Source: DAI-B 75/10(E), Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Artificial intelligence
Keywords: Artificial intelligence, Evolutionary computation, Learnable Evolution Model, Machine learning
Publication Number: 3625081
ISBN: 9781303995125