Acquiring language from speech by learning to remember and predict
- Source :
- CoNLL
- Publication Year :
- 2020
- Publisher :
- Association for Computational Linguistics, 2020.
Abstract
- Classical accounts of child language learning invoke memory limits as a pressure to discover sparse, language-like representations of speech, while more recent proposals stress the importance of prediction for language learning. In this study, we propose a broad-coverage unsupervised neural network model to test memory and prediction as sources of signal by which children might acquire language directly from the perceptual stream. Our model embodies several likely properties of real-time human cognition: it is strictly incremental, it encodes speech into hierarchically organized labeled segments, it allows interactive top-down and bottom-up information flow, it attempts to model its own sequence of latent representations, and its objective function only recruits local signals that are plausibly supported by human working memory capacity. We show that much phonemic structure is learnable from unlabeled speech on the basis of these local signals. We further show that remembering the past and predicting the future both contribute to the linguistic content of acquired representations, and that these contributions are at least partially complementary.
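- Illustration :
- The abstract describes two complementary local training signals: "remembering the past" (reconstructing recent input from a bounded latent memory) and "predicting the future" (forecasting upcoming input). The sketch below is a hypothetical illustration of that idea only, not the paper's model: it omits the hierarchical labeled segments and the interactive top-down/bottom-up information flow, and all module names, dimensions, and the MSE objective are assumptions made for this example.

```python
import torch
import torch.nn as nn

class IncrementalSpeechModel(nn.Module):
    """Minimal sketch of the two local signals (remember + predict).

    Hypothetical illustration; the paper's actual architecture is
    hierarchical and interactive, which this sketch does not model.
    """

    def __init__(self, frame_dim=13, latent_dim=64):
        super().__init__()
        # Strictly incremental encoder: consumes one frame at a time.
        self.encoder = nn.GRUCell(frame_dim, latent_dim)
        # "Remember": decode the current frame back from the latent state.
        self.reconstruct = nn.Linear(latent_dim, frame_dim)
        # "Predict": forecast the next frame from the same latent state.
        self.predict = nn.Linear(latent_dim, frame_dim)

    def forward(self, frames):
        # frames: (batch, time, frame_dim), e.g. acoustic feature vectors.
        batch, time, _ = frames.shape
        h = frames.new_zeros(batch, self.encoder.hidden_size)
        loss_remember, loss_predict = 0.0, 0.0
        for t in range(time):
            h = self.encoder(frames[:, t], h)
            # Local reconstruction signal, sized for a working-memory window.
            loss_remember = loss_remember + nn.functional.mse_loss(
                self.reconstruct(h), frames[:, t])
            # Local prediction signal: forecast the upcoming frame.
            if t + 1 < time:
                loss_predict = loss_predict + nn.functional.mse_loss(
                    self.predict(h), frames[:, t + 1])
        return loss_remember / time, loss_predict / max(time - 1, 1)

# Usage with random stand-in features (8 utterances, 100 frames each):
model = IncrementalSpeechModel()
dummy = torch.randn(8, 100, 13)
l_rem, l_pred = model(dummy)
(l_rem + l_pred).backward()
```

- Training on both losses jointly mirrors the abstract's finding that the memory and prediction signals make at least partially complementary contributions; either loss can also be used alone to probe its individual contribution.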
- Subjects :
- Artificial neural network
Working memory
Computer science
Cognition
Information flow
Language acquisition
Perception
Artificial intelligence
Natural language processing
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of the 24th Conference on Computational Natural Language Learning
- Accession number :
- edsair.doi...........9288d15bc2797181a462a81bb0d21420
- Full Text :
- https://doi.org/10.18653/v1/2020.conll-1.15