Introduction to learning classifier systems
This is an accessible introduction to Learning Classifier Systems (LCS) for undergraduate and postgraduate students, data analysts, and machine learning practitioners.
| Main Authors | Ryan J. Urbanowicz, Will N. Browne |
|---|---|
| Format | eBook |
| Language | English |
| Published | Berlin, Heidelberg: Springer, 2017 |
| Edition | 1 |
| Series | SpringerBriefs in Intelligent Systems |
| ISBN | 9783662550069, 3662550067 |
| ISSN | 2196-548X, 2196-5498 |
| DOI | 10.1007/978-3-662-55007-6 |
Table of Contents:
- Preface
- Acronyms and Glossary
- 1 LCSs in a Nutshell
  - 1.1 A Non-trivial Example Problem: The Multiplexer
  - 1.2 Key Elements
    - 1.2.1 Environment
    - 1.2.2 Rules, Matching, and Classifiers
    - 1.2.3 Discovery Component - Evolutionary Computation
    - 1.2.4 Learning Component
  - 1.3 LCS Functional Cycle
  - 1.4 Post-training
    - 1.4.1 Rule Compaction
    - 1.4.2 Prediction
    - 1.4.3 Evaluation
      - 1.4.3.1 Training & Testing Performance
      - 1.4.3.2 Significance of Performance
    - 1.4.4 Interpretation
  - 1.5 Code Exercises (eLCS)
- 2 LCS Concepts
  - 2.1 Learning
    - 2.1.1 Modeling with a Ruleset
  - 2.2 Classifier
    - 2.2.1 Rules
      - 2.2.1.1 Rule Worth
      - 2.2.1.2 Rules Versus Classifiers
      - 2.2.1.3 Niche
    - 2.2.2 Representation and Alphabet
    - 2.2.3 Generalisation
      - 2.2.3.1 Don't Care '#' Operator
      - 2.2.3.2 Overgeneral Rules
      - 2.2.3.3 Overspecific Rules
      - 2.2.3.4 Maximally General, Accurate Rules
  - 2.3 System
    - 2.3.1 Interaction with Problems
      - 2.3.1.1 Environment Properties
      - 2.3.1.2 Learning, Adaptive, and Cognitive Systems
      - 2.3.1.3 Evaluating Rules
    - 2.3.2 Cooperation of Classifiers
    - 2.3.3 Competition Between Classifiers
  - 2.4 Problem Properties
    - 2.4.1 Problem Complexity
      - 2.4.1.1 Size of Search Space
      - 2.4.1.2 Redundancy and Irrelevance
      - 2.4.1.3 Epistasis
      - 2.4.1.4 Heterogeneity
    - 2.4.2 Applications Overview
  - 2.5 Advantages
  - 2.6 Disadvantages
- 3 Functional Cycle Components
  - 3.1 Evolutionary Computation and LCSs
  - 3.2 Initial Considerations
  - 3.3 Basic Alphabets for Rule Representation
    - 3.3.1 Encoding for Binary Alphabets
    - 3.3.2 Interval-Based
      - 3.3.2.1 Hyperalphabets
      - 3.3.2.2 Mixed Representations
  - 3.4 Matching
  - 3.5 Covering
  - 3.6 Form a Correct Set or Select an Action
    - 3.6.1 Explore vs. Exploit
      - 3.6.1.1 Local Optima
    - 3.6.2 Action Selection
  - 3.7 Performing the Action
  - 3.8 Update
    - 3.8.1 Numerosity of Rules
    - 3.8.2 Fitness Sharing
  - 3.9 Selection for Rule Discovery
    - 3.9.1 Parent Selection Methods
      - 3.9.1.1 Roulette Wheel Selection
      - 3.9.1.2 Tournament Selection
  - 3.10 Rule Discovery
    - 3.10.1 When to Invoke Rule Discovery
    - 3.10.2 Identifying Building Blocks of Knowledge
    - 3.10.3 Mutation
    - 3.10.4 Crossover
      - 3.10.4.1 Single-Point, Two-Point, or Uniform Crossover
    - 3.10.5 Initialising Offspring Classifiers
    - 3.10.6 Other Rule Discovery
  - 3.11 Subsumption
  - 3.12 Deletion
  - 3.13 Summary
- 4 LCS Adaptability
  - 4.1 LCS Pressures
  - 4.2 Michigan-Style vs. Pittsburgh-Style LCSs
  - 4.3 Michigan-Style Approaches
    - 4.3.1 Michigan-Style Supervised Learning (UCS)
    - 4.3.2 Updates with Time-Weighted Recency Averages
    - 4.3.3 Michigan-Style Reinforcement Learning (e.g. XCS)
      - 4.3.3.1 XCS
      - 4.3.3.2 Zeroth-Level Classifier System (ZCS)
      - 4.3.3.3 Older Michigan-Style LCSs
      - 4.3.3.4 ExSTraCS
  - 4.4 Pittsburgh-Style Approaches
    - 4.4.1 GAssist and BioHEL
    - 4.4.2 GABIL, GALE, and A-PLUS
  - 4.5 Strength- vs. Accuracy-Based Fitness
    - 4.5.1 Strength-Based
    - 4.5.2 Accuracy-Based
  - 4.6 Niche-Based Rule Discovery
  - 4.7 Single- vs. Multi-step Learning
    - 4.7.1 Sense, Plan, Act
    - 4.7.2 Delayed Reward
      - 4.7.2.1 Q-Learning-Based Updates (e.g. Multi-step XCS)
      - 4.7.2.2 Discounted Reward
    - 4.7.3 Anticipatory Classifier Systems
  - 4.8 Computed Alphabets
    - 4.8.1 S-Expression and Genetic Programming
    - 4.8.2 Artificial Neural Networks
    - 4.8.3 Computed Prediction
    - 4.8.4 Computed Action
    - 4.8.5 Code Fragments
  - 4.9 Environment Considerations
- 5 Applying LCSs
  - 5.1 LCS Setup
    - 5.1.1 Run Parameter 'Sweet Spots'
      - 5.1.1.1 Learning Bounds
    - 5.1.2 Hybridise or Die
  - 5.2 Tuning
  - 5.3 Troubleshooting
    - 5.3.1 Lack of Convergence
  - 5.4 Where to Now?
    - 5.4.1 Workshops and Conferences
    - 5.4.2 Books, Journals, and Select Reviews
    - 5.4.3 Websites and Software
    - 5.4.4 Collaborate
  - 5.5 Concluding Remarks