Robust models to infer flexible nondeterministic finite automata.
- Source :
- Journal of Computational Science, Jul 2024, Vol. 79
- Publication Year :
- 2024
Abstract
- Grammatical inference involves learning a formal grammar as a finite state machine or a set of rewrite rules. This paper focuses on inferring nondeterministic finite automata (NFA) from a given sample of words: the NFA must accept some of the words and reject the others. To address this task, we construct several over-constrained inference models capable of finding NFA of size k + 1 that are directly convertible to NFA of size k. Additionally, we propose an NFA evaluation method based on random walks, along with two methods for enhancing the acceptance rates of the inferred NFA. The effectiveness of our approach is demonstrated through comprehensive experiments on several benchmark sets. This paper is an extended version of the ICCS 2023 paper entitled "Inference of over-constrained NFA of size k + 1 to efficiently and systematically derive NFA of size k for grammar learning" (Jastrzab et al., 2023 [1]).
- • Over-constrained models for k + 1 NFA enable fast and systematic derivation of k NFA.
- • Experiments indicate that our models help to reduce the number of unsolved instances.
- • Inferred NFA can be post-processed to adjust global accepting/rejecting capabilities.
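- The abstract does not describe the inference models or the random-walk evaluation in detail, so the Python sketch below only illustrates the underlying task setting: an NFA that must accept the positive sample words and reject the negative ones, plus a naive random-walk acceptance estimate. The NFA class, its transition encoding, and the walk strategy are illustrative assumptions, not the authors' models.

  import random

  # Illustrative NFA: states 0..k-1, transitions as a dict
  # (state, symbol) -> set of next states. This encoding is an
  # assumption for the sketch, not the paper's representation.
  class NFA:
      def __init__(self, k, transitions, initial, finals):
          self.k = k                   # number of states
          self.delta = transitions     # {(state, symbol): {states}}
          self.initial = set(initial)  # initial states
          self.finals = set(finals)    # accepting states

      def accepts(self, word):
          """Exact acceptance: track the set of reachable states."""
          current = set(self.initial)
          for symbol in word:
              current = {q for p in current
                           for q in self.delta.get((p, symbol), set())}
              if not current:
                  return False
          return bool(current & self.finals)

      def random_walk_accepts(self, word, walks=100, rng=random):
          """Estimate acceptance by sampling nondeterministic choices;
          a crude stand-in for a random-walk style evaluation."""
          for _ in range(walks):
              state = rng.choice(sorted(self.initial))
              ok = True
              for symbol in word:
                  nxt = sorted(self.delta.get((state, symbol), set()))
                  if not nxt:
                      ok = False
                      break
                  state = rng.choice(nxt)
              if ok and state in self.finals:
                  return True
          return False

  def sample_consistency(nfa, accept_words, reject_words):
      """Fraction of the labeled sample classified correctly."""
      hits = sum(nfa.accepts(w) for w in accept_words)
      hits += sum(not nfa.accepts(w) for w in reject_words)
      return hits / (len(accept_words) + len(reject_words))

  # Example: a 2-state NFA over {'a', 'b'} accepting words ending in 'a'.
  nfa = NFA(k=2,
            transitions={(0, 'a'): {0, 1}, (0, 'b'): {0}},
            initial=[0], finals=[1])
  print(sample_consistency(nfa, accept_words=["a", "ba", "aba"],
                           reject_words=["b", "ab", ""]))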
- Subjects :
- FINITE state machines
- RANDOM walks
Details
- Language :
- English
- ISSN :
- 1877-7503
- Volume :
- 79
- Database :
- Supplemental Index
- Journal :
- Journal of Computational Science
- Publication Type :
- Periodical
- Accession number :
- 177352235
- Full Text :
- https://doi.org/10.1016/j.jocs.2024.102309