Monotone Quantifiers Emerge via Iterated Learning
- Source :
- Cognitive Science, 45(8):e13027. Wiley-Blackwell
- Publication Year :
- 2021
Abstract
- Natural languages exhibit many semantic universals, that is, properties of meaning shared across all languages. In this paper, we develop an explanation of one very prominent semantic universal, the monotonicity universal. While existing work has shown that quantifiers satisfying the monotonicity universal are easier to learn, we provide a more complete explanation by considering the emergence of quantifiers from the perspective of cultural evolution. In particular, we show that quantifiers satisfying the monotonicity universal evolve reliably in an iterated learning paradigm with neural networks as agents.
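- The abstract refers to an iterated learning paradigm with neural networks as agents. The following is a minimal sketch of such a loop, not the authors' implementation: all names (QuantifierNet, train, iterated_learning) and the scene-encoding setup are hypothetical illustrations of how each generation's agent is trained on a bottlenecked sample of the previous agent's output and then produces the data for the next generation.

```python
# Minimal sketch of an iterated learning loop with neural-network agents.
# Hypothetical illustration only; not the paper's code.
import random

import torch
import torch.nn as nn


class QuantifierNet(nn.Module):
    """Toy learner: maps a fixed-size scene encoding to a truth value."""

    def __init__(self, n_features: int = 16, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x)).squeeze(-1)


def train(agent, scenes, labels, epochs=50, lr=1e-2):
    """Fit one generation's agent to the data it observes."""
    opt = torch.optim.Adam(agent.parameters(), lr=lr)
    loss_fn = nn.BCELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(agent(scenes), labels)
        loss.backward()
        opt.step()
    return agent


def iterated_learning(initial_labels, scenes, generations=10, bottleneck=64):
    """Each generation learns from a sample of the previous agent's output."""
    labels = initial_labels
    for _ in range(generations):
        # Transmission bottleneck: the learner sees only a subset of examples.
        idx = random.sample(range(len(scenes)), bottleneck)
        agent = train(QuantifierNet(scenes.shape[1]), scenes[idx], labels[idx])
        # The trained agent relabels all scenes; its output seeds the next generation.
        with torch.no_grad():
            labels = (agent(scenes) > 0.5).float()
    return labels
```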
- Subjects :
- Semantic universals
Theoretical computer science
Computer science
Cognitive Neuroscience
Cultural evolution
Experimental and Cognitive Psychology
Monotonic function
Semantics
Meaning (philosophy of language)
Artificial Intelligence
Humans
Learning
Generalized quantifiers
Iterated learning
Neural networks
Language
Artificial neural network
Problem of universals
Monotone polygon
Neural Networks, Computer
Natural language
Linguistic universal
Details
- Language :
- English
- ISSN :
- 0364-0213
- Volume :
- 45
- Issue :
- 8
- Database :
- OpenAIRE
- Journal :
- Cognitive Science
- Accession number :
- edsair.doi.dedup.....5682d522e9d65df431f2b504e7179ea0
- Full Text :
- https://doi.org/10.1111/cogs.13027