Meta-interpretive learning: application to grammatical inference
- Source :
- Machine Learning. 94:25-49
- Publication Year :
- 2013
- Publisher :
- Springer Science and Business Media LLC, 2013.
Abstract
- Despite early interest, Predicate Invention has lately been under-explored within ILP. We develop a framework in which predicate invention and recursive generalisations are implemented using abduction with respect to a meta-interpreter. The approach is based on a previously unexplored case of Inverse Entailment for Grammatical Inference of Regular languages. Every abduced grammar H is represented by a conjunction of existentially quantified atomic formulae. Thus ¬H is a universally quantified clause representing a denial. The hypothesis space of solutions for ¬H can be ordered by θ-subsumption. We show that the representation can be mapped to a fragment of Higher-Order Datalog in which atomic formulae in H are projections of first-order definite clause grammar rules and the existentially quantified variables are projections of first-order predicate symbols. This allows predicate invention to be effected by the introduction of first-order variables. Previous work by Inoue and Furukawa used abduction and meta-level reasoning to invent predicates representing propositions. By contrast, the present paper uses abduction within a meta-interpretive framework to invent relations. We describe implementations of Meta-interpretive Learning (MIL) using two different declarative representations: Prolog and Answer Set Programming (ASP). We compare these implementations against the state-of-the-art ILP system MC-TopLog on the tasks of learning Regular and Context-Free grammars, as well as learning a simplified natural language grammar and a grammatical description of a staircase. Experiments indicate that on randomly chosen grammars, the two implementations achieve significantly higher accuracies than MC-TopLog. In terms of running time, Metagol is overall the fastest on these tasks. Experiments indicate that the Prolog implementation is competitive with the ASP one due to its ability to encode a strong procedural bias.
We demonstrate that MIL can be applied to learning natural grammars. In this case, experiments indicate that increasing the available background knowledge reduces the running time. Additionally, ASPM (ASP using a meta-interpreter) is shown to have a speed advantage over Metagol when background knowledge is sparse. We also demonstrate that by combining MetagolR (Metagol with a Regular grammar meta-interpreter) and MetagolCF (Metagol with a Context-Free meta-interpreter) we can formulate a system, MetagolRCF, which changes representation by first assuming the target to be Regular and then, failing this, switching to assuming it to be Context-Free. MetagolRCF runs up to 100 times faster than MetagolCF on grammars chosen randomly from Regular and non-Regular Context-Free grammars.
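The abstract's central idea, a meta-interpreter that both parses (deduction) and abduces missing grammar rules (learning, with new nonterminals acting as invented predicates), can be illustrated with a toy sketch. The paper's actual implementations are in Prolog and ASP; the Python below, including the names `parse`, `accepts`, `learn` and the two-metarule encoding of Regular grammars, is purely illustrative and not the authors' code.

```python
def parse(q, seq, rules, budget, states):
    """Meta-interpreter over two Regular-grammar metarules:
    an acceptor rule ("acc", Q, c), read as Q -> c, and a chain
    rule ("delta", Q, c, P), read as Q -> c P.  Yields (rules, budget)
    pairs for every derivation of seq from nonterminal q, abducing
    up to `budget` missing rules along the way.  (No epsilon rules,
    so only non-empty strings are derivable in this toy version.)"""
    if len(seq) == 1:                      # acceptor metarule: Q -> c
        r = ("acc", q, seq[0])
        if r in rules:
            yield rules, budget
        elif budget > 0:
            yield rules | {r}, budget - 1  # abduce the missing rule
    elif len(seq) >= 2:                    # chain metarule: Q -> c P
        c, rest = seq[0], seq[1:]
        for p in states:                   # P ranges over available nonterminals,
            r = ("delta", q, c, p)         # including "invented" ones q1, q2, ...
            if r in rules:
                yield from parse(p, rest, rules, budget, states)
            elif budget > 0:
                yield from parse(p, rest, rules | {r}, budget - 1, states)

def accepts(seq, rules, states):
    """Deductive use of the same meta-interpreter: budget 0 disables abduction."""
    return any(True for _ in parse("q0", seq, rules, 0, states))

def learn(pos, neg, max_rules=4, max_states=3):
    """Return a rule set deriving every positive example from the start
    symbol q0 and no negative example, or None if none exists within
    the bounds.  Extra states beyond q0 play the role of invented predicates."""
    states = ["q%d" % i for i in range(max_states)]

    def cover(examples, rules, budget):
        if not examples:
            yield rules
            return
        for rs, b in parse("q0", list(examples[0]), rules, budget, states):
            yield from cover(examples[1:], rs, b)

    for rules in cover(list(pos), frozenset(), max_rules):
        if not any(accepts(list(n), rules, states) for n in neg):
            return rules
    return None
```

For example, `learn(["ab", "abab"], ["a", "b", "aba"], max_rules=3, max_states=2)` must abduce a chain rule into a second nonterminal, mirroring how predicate invention arises in MIL when a single start predicate cannot cover the examples.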
- Subjects :
- Theoretical computer science
- Computer science
- Grammar induction
- Predicate (grammar)
- Datalog
- Tree-adjoining grammar
- Prolog
- Regular language
- Artificial intelligence
- Definite clause grammar
- Regular grammar
- Natural language processing
Details
- ISSN :
- 1573-0565 and 0885-6125
- Volume :
- 94
- Database :
- OpenAIRE
- Journal :
- Machine Learning
- Accession number :
- edsair.doi...........41a4bc07f2438b2500cf0b37be04d6c6