Active Automata Learning as Black-Box Search and Lazy Partition Refinement

Bibliographic Details
Published in: A Journey from Process Algebra Via Timed Automata to Model Learning, Vol. 13560, pp. 321-338
Main Authors: Howar, Falk; Steffen, Bernhard
Format: Book Chapter
Language: English
Published: Springer Nature Switzerland, 2022
Series: Lecture Notes in Computer Science
Summary: We present a unifying formalization of active automata learning algorithms in the MAT model, including a new, efficient, and simple technique for the analysis of counterexamples during learning: $L^{\lambda}$ is the first active automata learning algorithm that does not add sub-strings of counterexamples to the underlying data structure for observations but instead performs black-box search and partition refinement. We analyze the worst-case complexity in terms of membership queries and equivalence queries and evaluate the presented learning algorithm on benchmark instances from the Automata Wiki, comparing its performance against efficient implementations of some learning algorithms from LearnLib.
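
For readers unfamiliar with the MAT model referenced in the abstract, the sketch below illustrates the two query types a learner has at its disposal: membership queries and equivalence queries. This is a minimal, hypothetical Python interface, not the paper's $L^{\lambda}$ algorithm and not LearnLib's API; all class and method names are illustrative assumptions.

```python
# A minimal sketch of the MAT (minimally adequate teacher) setting assumed by
# active automata learning algorithms such as L* and, per the abstract, L^lambda.
# All names here are illustrative; this is not the paper's algorithm or the
# LearnLib API.
from abc import ABC, abstractmethod
from itertools import product
from typing import Callable, Optional, Tuple

Word = Tuple[str, ...]


class MATTeacher(ABC):
    """The two query types the MAT model grants the learner."""

    @abstractmethod
    def membership_query(self, word: Word) -> bool:
        """Does the (black-box) target language accept `word`?"""

    @abstractmethod
    def equivalence_query(self, hypothesis: Callable[[Word], bool]) -> Optional[Word]:
        """Return None if `hypothesis` matches the target, else a counterexample."""


class BoundedSimulationTeacher(MATTeacher):
    """Toy teacher: answers queries from a known acceptor and approximates
    equivalence by exhaustively testing all words up to a fixed length."""

    def __init__(self, target: Callable[[Word], bool],
                 alphabet: Tuple[str, ...], depth: int = 6):
        self.target = target
        self.alphabet = alphabet
        self.depth = depth

    def membership_query(self, word: Word) -> bool:
        return self.target(word)

    def equivalence_query(self, hypothesis: Callable[[Word], bool]) -> Optional[Word]:
        for n in range(self.depth + 1):
            for word in product(self.alphabet, repeat=n):
                if hypothesis(word) != self.target(word):
                    return word  # counterexample handed back to the learner
        return None


if __name__ == "__main__":
    # Toy target language: words over {a, b} with an even number of a's.
    even_as = lambda w: w.count("a") % 2 == 0
    teacher = BoundedSimulationTeacher(even_as, ("a", "b"))
    print(teacher.membership_query(("a", "b", "a")))   # True (two a's)
    print(teacher.equivalence_query(lambda w: True))   # a counterexample, e.g. ('a',)
```

A learner in this setting interleaves membership queries to build a hypothesis with equivalence queries to test it; the abstract's contribution concerns how a returned counterexample is analyzed, namely by black-box search and partition refinement rather than by inserting its sub-strings into the observation data structure.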
ISBN: 9783031156281; 3031156285
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-031-15629-8_17